Lab 4: Data access and wrangling
Up until this point, you have been working with data prepared specifically for this course. However, downloading and unzipping individual datasets might not be the most efficient way to access data in some of your future projects. You need to be able to automate the data download process using Python. In this lab we will learn how to programmatically access data through APIs (Application Programming Interfaces). More specifically, we will look at the APIs below:
- Google Earth Engine
- OpenStreetMap
- Christchurch City Council open data portal
A working directory is the folder where you want to load the data from and save the data to. By default, it is the same folder where the .ipynb file is. Run print(os.getcwd()) to print the current working directory.
If necessary, you can change the working directory using os.chdir().
In this handout, it is assumed that you put the “Lab4_data” folder in the same folder as the .ipynb file.
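As a quick check, the two calls mentioned above can be used like this (the path in the comment is a placeholder, not a real folder):

```python
import os

# Print the current working directory
print(os.getcwd())

# If necessary, change the working directory (placeholder path shown)
# os.chdir('/path/to/your/Lab4/folder')
```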
1 APIs and data wrangling
An API is a set of rules and protocols that define how software applications can interact with each other. In the context of GIS, an API provides a standardized way for developers to access and manipulate geospatial data from different sources, such as mapping services, geospatial databases, or data providers. For example, a geospatial data API might provide endpoints (specific web URLs for interacting with the API) for searching for locations, retrieving maps or satellite imagery, or performing geospatial analysis on data. APIs can use a variety of data formats, such as JSON, XML, or CSV, to represent data in a structured way that can be easily processed by other software applications. Many APIs also require authentication, either through API keys or OAuth tokens, to ensure that only authorized users or applications can access the data or functionality provided by the API. Think of an API as a bridge between two software components. It defines how requests for specific functionalities or data should be made and how responses will be structured. APIs are commonly used to facilitate integration between different systems, enabling them to work together.
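For example, many APIs return JSON, which maps directly onto Python dictionaries. The snippet below parses a hand-written response string; the field names are made up for illustration, not taken from any real endpoint:

```python
import json

# A hypothetical JSON payload, like one a geolocation endpoint might return
response_text = '{"name": "Christchurch", "lat": -43.5321, "lon": 172.6362}'

# Parse the structured response into a Python dictionary
location = json.loads(response_text)
print(location["name"], location["lat"], location["lon"])
```

In a real request you would receive text like this in the body of an HTTP response and parse it in exactly the same way.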
Data wrangling, also known as data munging or data preprocessing, refers to the process of cleaning, structuring, and transforming raw data into a usable format for analysis, visualization, and other data-related tasks. Raw data often comes from various sources, including APIs, in different formats, and it may contain inconsistencies, errors, missing values, and other issues that need to be addressed before meaningful analysis can be performed.
Data wrangling involves some key steps:
- Data Collection: Gathering data from various sources, which could include databases, files, APIs, web scraping, and more.
- Data Cleaning: Identifying and correcting errors, inconsistencies, and inaccuracies in the data. This might involve removing duplicate records, handling missing values, and correcting typos.
- Data Transformation: Converting data into a common format or structure that is suitable for analysis. This can involve reshaping data, aggregating information, and creating new variables.
- Data Enrichment: Adding additional information to the data to enhance its value. This could involve merging data from multiple sources or appending external data.
- Data Formatting: Ensuring that data is in a consistent and standardized format. This might involve converting data types, formatting dates, and ensuring consistent units.
- Data Integration: Combining data from multiple sources to create a unified dataset that can be analyzed together.
- Data Validation and Quality Check: Ensuring that the processed data meets certain quality standards and is accurate and reliable.
- Data Exploration: Exploring the data to understand its characteristics, relationships, and patterns before formal analysis.
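A few of these steps can be sketched with pandas. The tiny dataset below is made up for illustration; it shows duplicate removal (cleaning) and missing-value filling (transformation):

```python
import pandas as pd

# Made-up raw data with a duplicate record and a missing value
raw = pd.DataFrame({
    'site': ['A', 'B', 'B', 'C'],
    'elevation_m': [12.0, 300.5, 300.5, None],
})

# Data cleaning: remove the duplicate record
clean = raw.drop_duplicates()

# Handle the missing value by filling it with the column mean
clean = clean.fillna({'elevation_m': clean['elevation_m'].mean()})
print(clean)
```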
2 Google Earth Engine Python API
Google Earth Engine (GEE) combines a multi-petabyte catalog of satellite imagery and geospatial datasets with planetary-scale analysis capabilities. Scientists, researchers, and developers use Earth Engine to detect changes, map trends, and quantify differences on the Earth’s surface. Earth Engine is now available for commercial use, and remains free for academic and research use. GEE is a cloud-based platform for analyzing and visualizing satellite imagery and other remote sensing products by writing code in JavaScript or Python.
2.1 Registering for a GEE account
We need a Google Account and a Google Earth Engine Account to access GEE. Create your free Google Account here. MAKE SURE IT IS ASSOCIATED WITH YOUR UC EMAIL by using your UC email as shown in the figure below:
Once you have created your Google Account, go to the Earth Engine sign-up by clicking here and sign in with your UC Google Account. The sign-up page shown below will be presented; fill it in with your information and the data shown in the figure below. Paste the following text under “What would you like to accomplish with Earth Engine?”: “We are using GEE in class for our GISC412|GEOG324 course. The aim is learning new ways to access geospatial data.” Check the box to agree with the terms of service and complete the reCAPTCHA to prove that you are not a robot. Click on the “SUBMIT” button.
A “thank you” page will be presented. You can close it. Go to your UC email inbox and look for an email from earthengine-noreply@google.com with the subject “Welcome to Google Earth Engine!”. If you have not received it, check your spam folder. If you still have not found it, do not worry. GEE registration is usually instantaneous for emails with an academic domain (.ac.nz), but it can take longer. Just move on to the second section of this handout and come back here once you have received an email like the one in the figure below.
2.2 Authenticating GEE
We will use two main Python libraries to access GEE: ee and geemap. The ee library is the official Python API for Google Earth Engine. geemap is a user-friendly Python library built on top of the Earth Engine Python API (ee). It provides simplified functions and tools for mapping, data visualization, and analysis. geemap abstracts some of the complexities of the Earth Engine API, making it more accessible to Python users who may not be as familiar with JavaScript programming. It offers interactive map displays, simplified data extraction, and visualization functionalities. You can find the ee manual here and the geemap manual here.
Let’s start by importing both libraries:
import ee
import geemap
Now we need to use ee to authenticate our access to Google Earth Engine and initialize the API connection. Once you run the code cell below, a new webpage will open to get you to sign in with your UC Google Account.
ee.Authenticate()
ee.Initialize()
The “Notebook Authenticator” page will open. As this is your first access, we need to create a “Cloud Project” by clicking on “CHOOSE PROJECT”.
Choose “Create a new Cloud Project”, define your project ID as “ee-gisc-your-uc-username”, e.g. “ee-gisc-vda47”, and click on the “Select” button:
Back to the “Notebook Authenticator” page, click on “GENERATE TOKEN”. The page “Choose an account to continue” will be loaded, choose your UC Google Account. Next the page “Google hasn’t verified this app” will be displayed, click on “Continue”. On the next page “Earth Engine Notebook Client - your-username@canterbury.ac.nz wants access to your Google Account” make sure you choose “Select all” like you see in the figure below, and click on “Continue”.
The token page is finally displayed. Copy the token (use the button indicated by the red square in the figure below), paste it into the “Enter verification code:” field that appeared under your ee.Authenticate() code cell, and press “Enter” on your keyboard. You should see the following message after pressing Enter: “Successfully saved authorization token.”
2.3 Retrieving datasets
GEE operates with a few primary object types: an ee.Image represents a single image or raster dataset; an ee.ImageCollection is a collection of images that share similar properties (such as spatial resolution, date range, and metadata); and ee.Feature and ee.FeatureCollection work with vector data.
2.3.1 ‘ee.Image’ | NASA SRTM Digital Elevation 30m product
Now let’s look at how we can retrieve and analyse satellite images using Google Earth Engine Python API. We will retrieve the NASA SRTM Digital Elevation 30m product. Each GEE dataset has a unique identifier known as an Earth Engine asset ID that must be used to retrieve it. You can find here the Earth Engine asset ID and metadata for all products available in GEE.
# "USGS/SRTMGL1_003" is the Earth Engine asset ID for elevation data captured by the SRTM mission.
# ee.Image retrieves the image from the specified Earth Engine asset ID
srtm = ee.Image("USGS/SRTMGL1_003")
# Inspect the result
srtm
(The output is an interactive summary of the ee.Image: its asset ID USGS/SRTMGL1_003, its single “elevation” band in EPSG:4326 with integer range [-32768, 32767], the dataset description, keywords, provider NASA / USGS / JPL-Caltech, suggested citation, and the title “NASA SRTM Digital Elevation 30m”.)
We can check that we have the correct GEE asset ID by retrieving the collection title, combining the .get() method with the .getInfo() function:
srtm.get("title").getInfo()
'NASA SRTM Digital Elevation 30m'
We can use .projection().crs() to get information on the coordinate reference system of an ee.Image object:
# The .projection() method provides information about the spatial characteristics of the image, including the CRS.
# .crs() retrieves the CRS info from the result of .projection()
srtm.projection().crs()
EPSG:4326
We can use .projection().nominalScale() to get information on the pixel size of an ee.Image object:
srtm.projection().nominalScale()
30.922080775909325
We can also retrieve metadata information from an image using the .getInfo() method. The .getInfo() method returns the metadata of an ee.Image in a structured format that you can then explore and use. In the code below, the metadata variable will contain the metadata information as a Python dictionary:
metadata = srtm.getInfo()
# Inspect the result
metadata
{'type': 'Image',
'bands': [{'id': 'elevation',
'data_type': {'type': 'PixelType',
'precision': 'int',
'min': -32768,
'max': 32767},
'dimensions': [1296001, 417601],
'crs': 'EPSG:4326',
'crs_transform': [0.0002777777777777778,
0,
-180.0001388888889,
0,
-0.0002777777777777778,
60.00013888888889]}],
'id': 'USGS/SRTMGL1_003',
'version': 1641990767055141,
'properties': {'system:visualization_0_min': '0.0',
'type_name': 'Image',
'keywords': ['dem',
'elevation',
'geophysical',
'nasa',
'srtm',
'topography',
'usgs'],
'thumb': 'https://mw1.google.com/ges/dd/images/SRTM90_V4_thumb.png',
'description': '<p>The Shuttle Radar Topography Mission (SRTM, see <a href="https://onlinelibrary.wiley.com/doi/10.1029/2005RG000183/full">Farr\net al. 2007</a>)\ndigital elevation data is an international research effort that\nobtained digital elevation models on a near-global scale. This\nSRTM V3 product (SRTM Plus) is provided by NASA JPL\nat a resolution of 1 arc-second (approximately 30m).</p><p>This dataset has undergone a void-filling process using open-source data\n(ASTER GDEM2, GMTED2010, and NED), as opposed to other versions that\ncontain voids or have been void-filled with commercial sources.\nFor more information on the different versions see the\n<a href="https://lpdaac.usgs.gov/documents/13/SRTM_Quick_Guide.pdf">SRTM Quick Guide</a>.</p><p>Documentation:</p><ul><li><p><a href="https://lpdaac.usgs.gov/documents/179/SRTM_User_Guide_V3.pdf">User's Guide</a></p></li><li><p><a href="https://lpdaac.usgs.gov/documents/13/SRTM_Quick_Guide.pdf">General Documentation</a></p></li><li><p><a href="https://doi.org/10.1029/2005RG000183">Algorithm Theoretical Basis Document (ATBD)</a></p></li></ul><p><b>Provider: <a href="https://cmr.earthdata.nasa.gov/search/concepts/C1000000240-LPDAAC_ECS.html">NASA / USGS / JPL-Caltech</a></b><br><p><b>Bands</b><table class="eecat"><tr><th scope="col">Name</th><th scope="col">Description</th></tr><tr><td>elevation</td><td><p>Elevation</p></td></tr></table><p><b>Terms of Use</b><br><p>Unless otherwise noted, images and video on JPL public\nweb sites (public sites ending with a jpl.nasa.gov address) may\nbe used for any purpose without prior permission. 
For more information\nand exceptions visit the <a href="https://www.jpl.nasa.gov/imagepolicy/">JPL Image Use Policy site</a>.</p><p><b>Suggested citation(s)</b><ul><li><p>Farr, T.G., Rosen, P.A., Caro, E., Crippen, R., Duren, R., Hensley,\nS., Kobrick, M., Paller, M., Rodriguez, E., Roth, L., Seal, D.,\nShaffer, S., Shimada, J., Umland, J., Werner, M., Oskin, M., Burbank,\nD., and Alsdorf, D.E., 2007, The shuttle radar topography mission:\nReviews of Geophysics, v. 45, no. 2, RG2004, at\n<a href="https://doi.org/10.1029/2005RG000183">https://doi.org/10.1029/2005RG000183</a>.</p></li></ul><style>\n table.eecat {\n border: 1px solid black;\n border-collapse: collapse;\n font-size: 13px;\n }\n table.eecat td, tr, th {\n text-align: left; vertical-align: top;\n border: 1px solid gray; padding: 3px;\n }\n td.nobreak { white-space: nowrap; }\n</style>',
'source_tags': ['nasa', 'usgs'],
'visualization_0_max': '6000.0',
'title': 'NASA SRTM Digital Elevation 30m',
'product_tags': ['srtm', 'elevation', 'topography', 'dem', 'geophysical'],
'provider': 'NASA / USGS / JPL-Caltech',
'visualization_0_min': '0.0',
'visualization_0_name': 'Elevation',
'date_range': [950227200000, 951177600000],
'system:visualization_0_gamma': '1.6',
'period': 0,
'system:visualization_0_bands': 'elevation',
'provider_url': 'https://cmr.earthdata.nasa.gov/search/concepts/C1000000240-LPDAAC_ECS.html',
'visualization_0_gamma': '1.6',
'sample': 'https://mw1.google.com/ges/dd/images/SRTM90_V4_sample.png',
'tags': ['dem',
'elevation',
'geophysical',
'nasa',
'srtm',
'topography',
'usgs'],
'system:visualization_0_max': '6000.0',
'system:visualization_0_name': 'Elevation',
'system:asset_size': 132792638252,
'visualization_0_bands': 'elevation'}}
Now we can access specific properties using dictionary indexing, such as metadata["bands"], metadata["properties"], or other relevant keys, depending on the specific metadata we are interested in:
print("Bands information:")
print(metadata["bands"])
print('\n')
print("Properties:")
print(metadata["properties"])
Bands information:
[{'id': 'elevation', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': -32768, 'max': 32767}, 'dimensions': [1296001, 417601], 'crs': 'EPSG:4326', 'crs_transform': [0.0002777777777777778, 0, -180.0001388888889, 0, -0.0002777777777777778, 60.00013888888889]}]
Properties:
{'system:visualization_0_min': '0.0', 'type_name': 'Image', 'keywords': ['dem', 'elevation', 'geophysical', 'nasa', 'srtm', 'topography', 'usgs'], 'thumb': 'https://mw1.google.com/ges/dd/images/SRTM90_V4_thumb.png', 'description': '<p>The Shuttle Radar Topography Mission (SRTM, see <a href="https://onlinelibrary.wiley.com/doi/10.1029/2005RG000183/full">Farr\net al. 2007</a>)\ndigital elevation data is an international research effort that\nobtained digital elevation models on a near-global scale. This\nSRTM V3 product (SRTM Plus) is provided by NASA JPL\nat a resolution of 1 arc-second (approximately 30m).</p><p>This dataset has undergone a void-filling process using open-source data\n(ASTER GDEM2, GMTED2010, and NED), as opposed to other versions that\ncontain voids or have been void-filled with commercial sources.\nFor more information on the different versions see the\n<a href="https://lpdaac.usgs.gov/documents/13/SRTM_Quick_Guide.pdf">SRTM Quick Guide</a>.</p><p>Documentation:</p><ul><li><p><a href="https://lpdaac.usgs.gov/documents/179/SRTM_User_Guide_V3.pdf">User's Guide</a></p></li><li><p><a href="https://lpdaac.usgs.gov/documents/13/SRTM_Quick_Guide.pdf">General Documentation</a></p></li><li><p><a href="https://doi.org/10.1029/2005RG000183">Algorithm Theoretical Basis Document (ATBD)</a></p></li></ul><p><b>Provider: <a href="https://cmr.earthdata.nasa.gov/search/concepts/C1000000240-LPDAAC_ECS.html">NASA / USGS / JPL-Caltech</a></b><br><p><b>Bands</b><table class="eecat"><tr><th scope="col">Name</th><th scope="col">Description</th></tr><tr><td>elevation</td><td><p>Elevation</p></td></tr></table><p><b>Terms of Use</b><br><p>Unless otherwise noted, images and video on JPL public\nweb sites (public sites ending with a jpl.nasa.gov address) may\nbe used for any purpose without prior permission. 
For more information\nand exceptions visit the <a href="https://www.jpl.nasa.gov/imagepolicy/">JPL Image Use Policy site</a>.</p><p><b>Suggested citation(s)</b><ul><li><p>Farr, T.G., Rosen, P.A., Caro, E., Crippen, R., Duren, R., Hensley,\nS., Kobrick, M., Paller, M., Rodriguez, E., Roth, L., Seal, D.,\nShaffer, S., Shimada, J., Umland, J., Werner, M., Oskin, M., Burbank,\nD., and Alsdorf, D.E., 2007, The shuttle radar topography mission:\nReviews of Geophysics, v. 45, no. 2, RG2004, at\n<a href="https://doi.org/10.1029/2005RG000183">https://doi.org/10.1029/2005RG000183</a>.</p></li></ul><style>\n table.eecat {\n border: 1px solid black;\n border-collapse: collapse;\n font-size: 13px;\n }\n table.eecat td, tr, th {\n text-align: left; vertical-align: top;\n border: 1px solid gray; padding: 3px;\n }\n td.nobreak { white-space: nowrap; }\n</style>', 'source_tags': ['nasa', 'usgs'], 'visualization_0_max': '6000.0', 'title': 'NASA SRTM Digital Elevation 30m', 'product_tags': ['srtm', 'elevation', 'topography', 'dem', 'geophysical'], 'provider': 'NASA / USGS / JPL-Caltech', 'visualization_0_min': '0.0', 'visualization_0_name': 'Elevation', 'date_range': [950227200000, 951177600000], 'system:visualization_0_gamma': '1.6', 'period': 0, 'system:visualization_0_bands': 'elevation', 'provider_url': 'https://cmr.earthdata.nasa.gov/search/concepts/C1000000240-LPDAAC_ECS.html', 'visualization_0_gamma': '1.6', 'sample': 'https://mw1.google.com/ges/dd/images/SRTM90_V4_sample.png', 'tags': ['dem', 'elevation', 'geophysical', 'nasa', 'srtm', 'topography', 'usgs'], 'system:visualization_0_max': '6000.0', 'system:visualization_0_name': 'Elevation', 'system:asset_size': 132792638252, 'visualization_0_bands': 'elevation'}
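One detail worth noting in the metadata above: the date_range property stores Unix timestamps in milliseconds. They can be made human-readable with the standard library (the two values below are copied from the metadata output):

```python
from datetime import datetime, timezone

# date_range values from the SRTM metadata, in milliseconds since the Unix epoch
start_ms, end_ms = 950227200000, 951177600000

# Divide by 1000 to get seconds, then convert to UTC dates
start = datetime.fromtimestamp(start_ms / 1000, tz=timezone.utc)
end = datetime.fromtimestamp(end_ms / 1000, tz=timezone.utc)
print(start.date(), end.date())  # 2000-02-11 2000-02-22
```

These dates match the SRTM mission, which flew in February 2000.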
Let’s visualize the ee.Image object stored in srtm using geemap:
# Create a dictionary to define visualization settings
srtm_vis_param = {
'min': 0, # Minimum elevation value
'max': 6000, # Maximum elevation value
# Color palette for visualization specified as hexadecimal color codes.
# I chose these colors using https://colorbrewer2.org/
'palette': ['#ffffe5','#fff7bc','#fee391','#fec44f','#fe9929','#ec7014','#cc4c02','#993404','#662506'],
}
# Create a Map instance from geemap
srtm_vis = geemap.Map()
# Add the ee.Image srtm to the map and specify the visualization parameter
# Set the layer name to "Elevation Data"
srtm_vis.addLayer(srtm, srtm_vis_param, "Elevation Data")
# add LayerControl to the map
srtm_vis.addLayerControl()
# Display the map
srtm_vis
As you can see above, we have data for the entire globe. Let’s select data just for Christchurch and its surroundings. In the Lab4_data folder, which you downloaded from Learn, you will find a geopackage named “study_area.gpkg”. In your JupyterHub, inside your Lab4 folder, create a folder named Lab4_data and then upload “study_area.gpkg” to the Lab4_data folder in your JupyterHub before attempting the next steps. Go back to Lab 1 section 4.3 if you do not remember how to upload data to JupyterHub.
Run the code below after uploading the data:
import geopandas as gpd
limits = gpd.read_file('Lab4_data/study_area.gpkg')
limits.plot()
<Axes: >
We will use the polygon above to create a bounding box to clip the ee.Image(). Let’s first visualize the bounding box:
# The envelope of a geometry is the smallest bounding box that contains the geometry
# The boundary function is applied to the envelope to obtain the boundary geometry,
# which is a LineString representing the outer boundary of the envelope.
limits.envelope.boundary.plot()
<Axes: >
We need to translate this bounding box into an ee.Geometry, which is a format that GEE understands. We start by creating a list with the values for the pairs of coordinates defining our bounding box:
import numpy as np
# Create a list of coordinates by iterating through each vertex of the bounding box boundary
all_coords = [np.dstack(vertex.coords.xy).tolist()[0] for vertex in limits.envelope.boundary.to_crs(4326)]
# Inspect the result
print(all_coords)
[[[172.2371989116238, -43.89914795563117], [173.1305615770355, -43.90161932153406], [173.1293044957875, -43.317664419206515], [172.24454253112387, -43.31524264531035], [172.2371989116238, -43.89914795563117]]]
The code above uses a list comprehension to create a list of coordinate lists (x, y pairs) for each vertex in the boundary of the bounding box. It iterates through each vertex in the boundary of the bounding box of the given geometry. Let’s understand each part:
np.dstack(vertex.coords.xy).tolist()[0]:
- vertex represents a vertex (point) in the boundary of the bounding box.
- vertex.coords.xy returns a tuple containing the x-coordinates and y-coordinates of the vertex.
- np.dstack is used to stack the x-coordinates and y-coordinates into a 2D array, which results in a 3D array (with shape (1, num_coords, 2)).
- .tolist() converts the 3D array into a nested list.
- [0] is used to extract the first (and only) sublist from the nested list. This sublist contains the coordinates of the vertex as a list.
for vertex in limits.envelope.boundary.to_crs(4326):
- .envelope.boundary returns the boundary of the bounding box envelope of the geometry. The boundary is a linear object (LineString) that represents the outer boundary of the envelope.
- .to_crs(4326) converts the coordinates of the boundary to the EPSG 4326 coordinate reference system (latitude and longitude).
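To see what the np.dstack(...).tolist()[0] pattern produces, here is a toy run on made-up coordinate lists (mimicking what vertex.coords.xy returns), with no geometry objects involved:

```python
import numpy as np

# Toy x and y coordinate sequences for a rectangle's five boundary vertices
xs = [172.0, 173.0, 173.0, 172.0, 172.0]
ys = [-43.9, -43.9, -43.3, -43.3, -43.9]

# np.dstack pairs the sequences into (x, y) coordinates along a third axis
pairs = np.dstack((xs, ys)).tolist()[0]
print(pairs)  # [[172.0, -43.9], [173.0, -43.9], [173.0, -43.3], [172.0, -43.3], [172.0, -43.9]]
```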
Now we use this list of coordinates to create a ee.feature:
# Create ee.Geometry feature using the extracted coordinates
# to define a region of interest (roi)
roi = ee.Geometry.Polygon(all_coords)
Now we can clip the original data using roi:
# Clip the image to the specified region
srtm_chc = srtm.clip(roi)
Let’s also extract the x and y coordinates of the bounding box centroid. We will use them later to center our visualization.
# Calculate the x-coordinate of the centroid
# Transform the bounding box boundary to EPSG 4326 and access the x-coordinate of the centroid
cx = limits.envelope.boundary.to_crs(4326).centroid.x[0]
# Calculate the y-coordinate of the centroid
# Transform the bounding box boundary to EPSG 4326 and access the y-coordinate of the centroid
cy = limits.envelope.boundary.to_crs(4326).centroid.y[0]
Now we can visualize our clipped ee.Image stored in the variable srtm_chc:
# Create a dictionary to define visualization settings
srtm_vis_param = {
'min': -5, # Minimum elevation value
'max': 905, # Maximum elevation value
'opacity': 0.5, # Opacity of the visualization (0.0 to 1.0)
# Color palette for visualization specified as hexadecimal color codes.
# I chose these colors using https://colorbrewer2.org/
'palette': ['#ffffe5', '#fff7bc', '#fee391', '#fec44f', '#fe9929', '#ec7014', '#cc4c02', '#993404', '#662506'],
}
# Create a Map instance from geemap with specified center and zoom level
# Set the center of the map to the centroid coordinates
srtm_vis = geemap.Map(center=[cy, cx], zoom=10)
# Load an ee.Image (srtm_chc) and add it to the map with the specified visualization parameters
# Set the layer name to "Elevation Data"
srtm_vis.addLayer(srtm_chc, srtm_vis_param, "Elevation Data")
# Add LayerControl widget to the map for managing layers
srtm_vis.addLayerControl()
# Extract color palette, minimum, and maximum values from the visualization parameter
colors = srtm_vis_param['palette']
vmin = srtm_vis_param['min']
vmax = srtm_vis_param['max']
# Add a colorbar to the map using Branca
srtm_vis.add_colorbar_branca(colors=colors, vmin=vmin, vmax=vmax, layer_name="Elevation Data")
# Display the interactive map
srtm_vis
2.3.1.1 Calculating slope and aspect
We can also use ee.Terrain to derive slope and aspect from the srtm_chc DEM.
Slope refers to the steepness of the terrain, or how much the land surface deviates from being flat. It is often expressed as a percentage, degree, or ratio. Slope can be calculated by measuring the vertical rise (elevation difference) between two points on the land surface and dividing it by the horizontal distance between those points. Steeper slopes indicate more rapid changes in elevation over a short distance, while gentler slopes represent gradual elevation changes.
Aspect is the compass direction toward which a slope faces. It indicates the direction in which a slope is oriented. Aspect is typically measured in degrees from 0° (north) to 360°, moving clockwise. For example, a slope with an aspect of 90° (east) means that the slope faces east. Aspect plays a significant role in understanding how sunlight, wind, and temperature patterns interact with the landscape. It affects factors like vegetation growth, snow accumulation and melt, and habitat distribution.
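The rise-over-run definition of slope can be checked with a few lines of plain Python (the 30 m rise over 100 m run below is a made-up example):

```python
import math

# Made-up example: 30 m of vertical rise over 100 m of horizontal distance
rise, run = 30.0, 100.0

slope_ratio = rise / run                               # 0.3 as a ratio
slope_percent = slope_ratio * 100                      # 30% as a percentage
slope_degrees = math.degrees(math.atan(slope_ratio))   # ~16.7 as degrees

print(slope_percent, round(slope_degrees, 1))
```

Note that a 30% slope is well under 30°: percent and degrees only coincide near zero.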
Let’s start by calculating and visualizing the slope:
# Calculate slope. Units are degrees, range is [0,90].
slope_chc = ee.Terrain.slope(srtm_chc)
# Create a dictionary to define visualization settings
slope_vis_param = {
'min': 0, # Minimum slope value
'max': 90, # Maximum slope value
'opacity': 0.5, # Opacity of the visualization (0.0 to 1.0)
# Color palette for visualization specified as hexadecimal color codes.
# I chose these colors using https://colorbrewer2.org/
'palette': ['#ffffe5', '#fff7bc', '#fee391', '#fec44f', '#fe9929', '#ec7014', '#cc4c02', '#993404', '#662506'],
}
# Create a Map instance from geemap with specified center and zoom level
# Set the center of the map to the centroid coordinates
slope_vis = geemap.Map(center=[cy, cx], zoom=10)
# Load an ee.Image (slope_chc) and add it to the map with the specified visualization parameters
# Set the layer name to "Slope in degrees"
slope_vis.addLayer(slope_chc, slope_vis_param, "Slope in degrees")
# Add LayerControl widget to the map for managing layers
slope_vis.addLayerControl()
# Extract color palette, minimum, and maximum values from the visualization parameter
colors = slope_vis_param['palette']
vmin = slope_vis_param['min']
vmax = slope_vis_param['max']
# Add a colorbar to the map using Branca
slope_vis.add_colorbar_branca(colors=colors, vmin=vmin, vmax=vmax, layer_name="Slope in degrees")
# Display the interactive map
slope_vis
Now, let’s calculate and visualize the aspect:
# Calculate aspect. Units are degrees where 0=N, 90=E, 180=S, 270=W.
aspect_chc = ee.Terrain.aspect(srtm_chc)
# Create a dictionary to define visualization settings
aspect_vis_param = {
'min': 0, # Minimum aspect value (compass direction)
'max': 360, # Maximum aspect value (compass direction)
'opacity': 0.5, # Opacity of the visualization (0.0 to 1.0)
# Color palette for visualization specified as color names.
# These colors will be used to represent different aspect directions.
'palette': ['red', 'green', 'blue', 'yellow'],
}
# Create a Map instance from geemap with specified center and zoom level
# Set the center of the map to the centroid coordinates
aspect_vis = geemap.Map(center=[cy, cx], zoom=10)
# Load an ee.Image (aspect_chc) and add it to the map with the specified visualization parameters
# Set the layer name to "Aspect in degrees"
aspect_vis.addLayer(aspect_chc, aspect_vis_param, "Aspect in degrees")
# Add LayerControl widget to the map for managing layers
aspect_vis.addLayerControl()
# Extract color palette, minimum, and maximum values from the visualization parameter
colors = aspect_vis_param['palette']
vmin = aspect_vis_param['min']
vmax = aspect_vis_param['max']
# Add a colorbar to the map using Branca
aspect_vis.add_colorbar_branca(colors=colors, vmin=vmin, vmax=vmax, layer_name="Aspect in degrees")
# Display the interactive map
aspect_vis
2.4 Downloading images to Google Drive
Downloading a GEE image or product is useful for performing further analysis, operating on the data with a different library, or exporting it for use in different software. Let’s learn how to do that by exporting the DEM we retrieved and clipped (srtm_chc). The ee.batch.Export.image.toDrive function is used to export an Earth Engine image to the Google Drive linked to the GEE account.
# Define the export parameters
# Specify the image to be exported (srtm_chc)
image = srtm_chc
# Description for the export task. This will be the description shown on the GEE Code Editor
description = 'export_dem'
# Folder in Google Drive where the exported data will be stored. If the folder does not exist, it will be created
folder = 'Lab4'
# Scale of the exported image (in meters per pixel)
scale = 30
# Coordinate Reference System (CRS) for the exported image (e.g., EPSG:2193)
crs = 'EPSG:2193'
# Initialize an Earth Engine batch export task
# Create an export task using Export.image.toDrive
task = ee.batch.Export.image.toDrive(
image=image,
description=description,
folder=folder,
region=roi,
scale=scale,
crs=crs
)
# Start the export task
task.start()
Now, log in to the Google Earth Engine Code Editor with your UC Google Account. The following interface will be presented:
In the top-right corner of the GEE Code Editor, go to the “Tasks” tab. There you will see a list of submitted tasks, their status, time running on the server, and other details. You should have a task named ‘export_dem’; click on the task name in the GEE Code Editor to show more details. This task will take approximately two minutes to run. Once it is finished, a grey check mark will appear next to the task name, and under the task details you will see the option “Open in Drive”, as shown in the image below:
You can also verify the task status by running and rerunning the code below:
task.status()
{'state': 'READY',
'description': 'export_dem',
'creation_timestamp_ms': 1691931024045,
'update_timestamp_ms': 1691931024045,
'start_timestamp_ms': 0,
'task_type': 'EXPORT_IMAGE',
'id': 'HAVVRXXWDQIDQTKLLM2Q77EB',
'name': 'projects/earthengine-legacy/operations/HAVVRXXWDQIDQTKLLM2Q77EB'}
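Rerunning task.status() by hand works, but for longer exports you may prefer to poll until the task leaves the queue. Below is a minimal sketch; the wait_for_task helper is written for this handout (it is not part of the ee package) and only assumes an object with a .status() method returning a dict with a 'state' key, as ee.batch.Task provides.

```python
import time

def wait_for_task(task, poll_seconds=30):
    """Poll an export task until it reaches a terminal state.

    Assumes `task` exposes .status() returning a dict with a 'state'
    key, as ee.batch.Task does. Returns the final state string.
    """
    terminal = {'COMPLETED', 'FAILED', 'CANCELLED'}
    while True:
        state = task.status()['state']
        if state in terminal:
            return state
        # Wait before asking the server again
        time.sleep(poll_seconds)
```

For the export above, you could call wait_for_task(task) right after task.start() and the cell would block until the task finishes.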
Go to the destination folder where our download was saved by either 1) clicking on the ‘destination_uris’ link that appears in the task.status() output once it returns ‘COMPLETED’; or 2) clicking on the “Open in Drive” button in the GEE Code Editor. Once you do that, you will be presented with the following page:
Click on the three dots (see red rectangle above) and choose the “Download” option. If you are not asked to select a location to save the file, the file will be saved to the “Downloads” folder. Upload “export_dem.tif” to the Lab4_data folder in your Jupyter Hub before attempting the next steps. Go back to Lab 1 section 4.3 if you do not remember how to upload data to Jupyter Hub.
Now, if we want, we can use rioxarray and/or rasterio to read and analyse the data downloaded via GEE. Let’s quickly read the DEM data, clip it to our study area limits, and visualize it using ONLY rioxarray and matplotlib.pyplot:
# WRITE YOUR CODE HERE. IF YOU DO NOT REMEMBER HOW TO DO THIS, GO BACK AND HAVE A LOOK AT LAB 3.
# YOUR RESULT SHOULD LOOK LIKE WHAT YOU SEE BELOW WHEN PLOTTED.
# Note that I used cmap ='terrain' and defined vmin and vmax for plotting
# using the vmin and vmax calculated from the clipped image
2.5 ee.ImageCollection | Sentinel-2 data
We will retrieve data from the Sentinel-2 MSI: MultiSpectral Instrument, Level-2A collection | GEE Asset ID = “COPERNICUS/S2_SR”. GEE provides various methods to filter, manipulate, and analyze an ee.ImageCollection, allowing you to create composites, perform temporal analyses, and extract specific images based on time or other criteria. We will look into some of these methods, but you should consult the GEE developer guide for more methods.
We will start by defining a date range and location to search and filter the image collection:
# Define start and end dates using ee.Date objects
start_date = ee.Date('2023-01-01')
end_date = ee.Date('2023-01-10')
# Create an ee.DateRange using the start and end dates
date_range = ee.DateRange(start_date, end_date)
# Define a geometry (point) using longitude and latitude coordinates
geom = ee.Geometry.Point(172.617, -43.531)
# You can also define the area of interest using bounding box coordinates as [xmin, ymin, xmax, ymax]
# Uncomment the following line if you want to use a bounding box geometry instead
# geom = ee.Geometry.Rectangle([172.605, -43.540, 172.629, -43.522])
Now we use the ee objects date_range and geom to filter the ‘COPERNICUS/S2_SR’ ImageCollection:
# Load and filter the image collection based on dates and location
s2_collection = ee.ImageCollection('COPERNICUS/S2_SR').filterBounds(geom).filterDate(date_range)
# Print the number of images in the filtered collection
collection_size = s2_collection.size()
print('Data type for s2_collection: ', type(s2_collection))
print('Number of Images in Filtered Collection:', collection_size.getInfo())
Data type for s2_collection: <class 'ee.imagecollection.ImageCollection'>
Number of Images in Filtered Collection: 2
Our search criteria returned two images. Let’s iterate over the collection and print the image IDs:
# Get the size of the Sentinel-2 image collection
count = s2_collection.size()
# Convert the collection to a list of images and iterate over them
# Print the image IDs
image_list = s2_collection.toList(count)
for i in range(count.getInfo()):
    # Get the current image from the list
    image = ee.Image(image_list.get(i))
    # Print the image ID
    print("Image ID:", image.id().getInfo())
Image ID: 20230103T222541_20230103T222543_T59GPM
Image ID: 20230108T222539_20230108T222541_T59GPM
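The system:index of a Sentinel-2 L2 asset encodes the sensing date and time, the product generation date and time, and the MGRS tile reference (this format is described in the GEE catalog entry for the collection). The IDs can be unpacked with plain Python, no server calls needed; the parse_s2_index helper below is written for this handout, not part of ee.

```python
from datetime import datetime

def parse_s2_index(index):
    """Split a Sentinel-2 L2A system:index such as
    '20230103T222541_20230103T222543_T59GPM' into its parts.
    (Helper written for this handout; not part of ee.)"""
    sensing, generation, tile = index.split('_')
    return {
        'sensing_time': datetime.strptime(sensing, '%Y%m%dT%H%M%S'),
        'generation_time': datetime.strptime(generation, '%Y%m%dT%H%M%S'),
        'mgrs_tile': tile.lstrip('T'),
    }

# The first image found above was sensed on 3 January 2023, tile 59GPM
print(parse_s2_index('20230103T222541_20230103T222543_T59GPM'))
```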
Now, select one of the images in the collection:
# Use `.get()` method to query the first image
# Convert the collection to a list of images and get the first image (index 0)
s2_img = ee.Image(s2_collection.toList(count).get(0))
# Alternatively, we also can select the first (i.e., earliest) image in the sorted collection
# Sort the collection by 'system:time_start' (acquisition time) in ascending order
# s2_sorted = s2_collection.sort('system:time_start')
# Get the first (earliest) image from the sorted collection
# s2_img = s2_sorted.first()
Get the metadata for the selected image:
print(s2_img.getInfo()) # get metadata of the first image, can be slow
print(s2_img.id().getInfo()) # get id of the first image
{'type': 'Image', 'bands': [{'id': 'B1', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [1830, 1830], 'crs': 'EPSG:32759', 'crs_transform': [60, 0, 600000, 0, -60, 5200000]}, {'id': 'B2', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'B3', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'B4', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'B5', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'B6', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'B7', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'B8', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'B8A', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'B9', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [1830, 1830], 'crs': 'EPSG:32759', 'crs_transform': 
[60, 0, 600000, 0, -60, 5200000]}, {'id': 'B11', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'B12', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'AOT', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'WVP', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'SCL', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'TCI_R', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'TCI_G', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'TCI_B', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'MSK_CLDPRB', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'MSK_SNWPRB', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 
5200000]}, {'id': 'QA10', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'QA20', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 4294967295}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'QA60', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [1830, 1830], 'crs': 'EPSG:32759', 'crs_transform': [60, 0, 600000, 0, -60, 5200000]}], 'id': 'COPERNICUS/S2_SR/20230103T222541_20230103T222543_T59GPM', 'version': 1672879345200103, 'properties': {'SPACECRAFT_NAME': 'Sentinel-2A', 'SATURATED_DEFECTIVE_PIXEL_PERCENTAGE': 0, 'BOA_ADD_OFFSET_B12': -1000, 'CLOUD_SHADOW_PERCENTAGE': 0.110202, 'system:footprint': {'type': 'LinearRing', 'coordinates': [[172.25433569144138, -44.33448907292303], [172.25435144080205, -44.33448964142126], [173.63031409631594, -44.311082950423106], [173.63037036150135, -44.311045091819324], [173.6304327971241, -44.31101254040063], [173.63043604474908, -44.3109976889258], [173.6086507648804, -43.817310375548736], [173.58741340494214, -43.32357935835102], [173.58736140729522, -43.323539083561485], [173.58731672178305, -43.32349437489322], [173.58729624768324, -43.32349197859048], [172.23381029069893, -43.34611004471465], [172.23375407396483, -43.34614722400039], [172.23369180434426, -43.34617901181267], [172.23368823897152, -43.34619378033401], [172.24382519264705, -43.84032357395031], [172.2542237050964, -44.33440101880432], [172.25427567805582, -44.33444193813843], [172.2543201907514, -44.33448715793944], [172.25433569144138, -44.33448907292303]]}, 'SENSOR_QUALITY': 'PASSED', 'GENERATION_TIME': 1672792852000, 'CLOUDY_PIXEL_OVER_LAND_PERCENTAGE': 50.69924, 'CLOUD_COVERAGE_ASSESSMENT': 46.642885, 'THIN_CIRRUS_PERCENTAGE': 33.942598, 'GRANULE_MEAN_WV': 1.643327, 'BOA_ADD_OFFSET_B1': 
-1000, 'BOA_ADD_OFFSET_B2': -1000, 'DATASTRIP_ID': 'S2A_OPER_MSI_L2A_DS_2APS_20230104T004052_S20230103T222543_N05.09', 'BOA_ADD_OFFSET_B5': -1000, 'BOA_ADD_OFFSET_B6': -1000, 'BOA_ADD_OFFSET_B3': -1000, 'BOA_ADD_OFFSET_B4': -1000, 'BOA_ADD_OFFSET_B9': -1000, 'BOA_ADD_OFFSET_B7': -1000, 'BOA_ADD_OFFSET_B8': -1000, 'GRANULE_ID': 'L2A_T59GPM_A039351_20230103T222543', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B8': 279.21408931448, 'DATATAKE_TYPE': 'INS-NOBS', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B9': 276.537241460679, 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B6': 277.024235679562, 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B7': 276.752673977078, 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B4': 277.700411170689, 'NOT_VEGETATED_PERCENTAGE': 1.688521, 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B5': 277.321221572236, 'RADIOMETRIC_QUALITY': 'PASSED', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B2': 279.971353433579, 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B3': 278.597725948063, 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B1': 276.704093858901, 'HIGH_PROBA_CLOUDS_PERCENTAGE': 4.07571, 'UNCLASSIFIED_PERCENTAGE': 0.177929, 'OZONE_SOURCE': 'AUX_ECMWFT', 'GRANULE_MEAN_AOT': 0.109682, 'BOA_ADD_OFFSET_B8A': -1000, 'SNOW_ICE_PERCENTAGE': 0.025418, 'BOA_ADD_OFFSET_B11': -1000, 'BOA_ADD_OFFSET_B10': -1000, 'GEOMETRIC_QUALITY': 'PASSED', 'system:asset_size': 1532591790, 'system:index': '20230103T222541_20230103T222543_T59GPM', 'DATATAKE_IDENTIFIER': 'GS2A_20230103T222541_039351_N05.09', 'AOT_RETRIEVAL_ACCURACY': 0, 'AOT_RETRIEVAL_METHOD': 'CAMS', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B8A': 276.493647061935, 'MEAN_SOLAR_AZIMUTH_ANGLE': 60.7237619555518, 'VEGETATION_PERCENTAGE': 9.221157, 'SOLAR_IRRADIANCE_B12': 85.25, 'SOLAR_IRRADIANCE_B10': 367.15, 'SOLAR_IRRADIANCE_B11': 245.59, 'SOLAR_IRRADIANCE_B8A': 955.32, 'FORMAT_CORRECTNESS': 'PASSED', 'system:time_end': 1672784896764, 'WATER_VAPOUR_RETRIEVAL_ACCURACY': 0, 'OZONE_VALUE': 266.58559, 'system:time_start': 1672784896764, 'PROCESSING_BASELINE': '05.09', 'SENSING_ORBIT_NUMBER': 29, 'NODATA_PIXEL_PERCENTAGE': 0, 'SENSING_ORBIT_DIRECTION': 
'DESCENDING', 'GENERAL_QUALITY': 'PASSED', 'REFLECTANCE_CONVERSION_CORRECTION': 1.03425237032704, 'MEDIUM_PROBA_CLOUDS_PERCENTAGE': 8.624577, 'MEAN_INCIDENCE_ZENITH_ANGLE_B1': 5.24543349610619, 'MEAN_INCIDENCE_ZENITH_ANGLE_B5': 5.02264667873905, 'MEAN_INCIDENCE_ZENITH_ANGLE_B4': 4.97874598116308, 'MEAN_INCIDENCE_ZENITH_ANGLE_B3': 4.90792023317554, 'MEAN_INCIDENCE_ZENITH_ANGLE_B2': 4.84258221755613, 'MEAN_INCIDENCE_ZENITH_ANGLE_B9': 5.31002472910349, 'MEAN_INCIDENCE_ZENITH_ANGLE_B8': 4.87519026295586, 'MEAN_INCIDENCE_ZENITH_ANGLE_B7': 5.12560449971087, 'DARK_FEATURES_PERCENTAGE': 0.015425, 'MEAN_INCIDENCE_ZENITH_ANGLE_B6': 5.07216724083106, 'MEAN_SOLAR_ZENITH_ANGLE': 33.1359663336527, 'MEAN_INCIDENCE_ZENITH_ANGLE_B8A': 5.18056015760518, 'RADIATIVE_TRANSFER_ACCURACY': 0, 'MGRS_TILE': '59GPM', 'CLOUDY_PIXEL_PERCENTAGE': 46.642885, 'PRODUCT_ID': 'S2A_MSIL2A_20230103T222541_N0509_R029_T59GPM_20230104T004052', 'MEAN_INCIDENCE_ZENITH_ANGLE_B10': 4.94842418199968, 'SOLAR_IRRADIANCE_B9': 812.92, 'DEGRADED_MSI_DATA_PERCENTAGE': 0.0222, 'MEAN_INCIDENCE_ZENITH_ANGLE_B11': 5.05856635514966, 'L2A_QUALITY': 'PASSED', 'MEAN_INCIDENCE_ZENITH_ANGLE_B12': 5.18669139590784, 'SOLAR_IRRADIANCE_B6': 1287.61, 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B10': 277.840545306706, 'SOLAR_IRRADIANCE_B5': 1424.64, 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B11': 276.999523662303, 'SOLAR_IRRADIANCE_B8': 1041.63, 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B12': 276.418405154167, 'SOLAR_IRRADIANCE_B7': 1162.08, 'SOLAR_IRRADIANCE_B2': 1959.66, 'SOLAR_IRRADIANCE_B1': 1884.69, 'SOLAR_IRRADIANCE_B4': 1512.06, 'SOLAR_IRRADIANCE_B3': 1823.24, 'WATER_PERCENTAGE': 42.118463}}
20230103T222541_20230103T222543_T59GPM
We can get other metadata parameters using dictionary keys:
# Check the available keys
print(s2_img.getInfo().keys())
dict_keys(['type', 'bands', 'id', 'version', 'properties'])
# Use keys to get specific information
print(s2_img.getInfo()['bands']) # get bands information
print('\n')
print('Cloud coverage is: ', s2_img.getInfo()['properties']['CLOUD_COVERAGE_ASSESSMENT']) # get cloud cover
[{'id': 'B1', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [1830, 1830], 'crs': 'EPSG:32759', 'crs_transform': [60, 0, 600000, 0, -60, 5200000]}, {'id': 'B2', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'B3', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'B4', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'B5', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'B6', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'B7', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'B8', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'B8A', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'B9', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [1830, 1830], 'crs': 
'EPSG:32759', 'crs_transform': [60, 0, 600000, 0, -60, 5200000]}, {'id': 'B11', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'B12', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'AOT', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'WVP', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'SCL', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'TCI_R', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'TCI_G', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'TCI_B', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'MSK_CLDPRB', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'MSK_SNWPRB', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 255}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': 
[20, 0, 600000, 0, -20, 5200000]}, {'id': 'QA10', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [10980, 10980], 'crs': 'EPSG:32759', 'crs_transform': [10, 0, 600000, 0, -10, 5200000]}, {'id': 'QA20', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 4294967295}, 'dimensions': [5490, 5490], 'crs': 'EPSG:32759', 'crs_transform': [20, 0, 600000, 0, -20, 5200000]}, {'id': 'QA60', 'data_type': {'type': 'PixelType', 'precision': 'int', 'min': 0, 'max': 65535}, 'dimensions': [1830, 1830], 'crs': 'EPSG:32759', 'crs_transform': [60, 0, 600000, 0, -60, 5200000]}]
Cloud coverage is: 46.642885
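Each getInfo() call above is a separate round trip to the GEE servers, so for repeated lookups it is cheaper to call it once, store the resulting dictionary, and index into it locally. A minimal sketch follows; the `info` dictionary below is a hand-made stand-in for s2_img.getInfo() (values copied from the output above) so the snippet runs offline.

```python
# Stand-in for `info = s2_img.getInfo()` so the snippet runs offline
info = {
    'type': 'Image',
    'id': 'COPERNICUS/S2_SR/20230103T222541_20230103T222543_T59GPM',
    'properties': {'CLOUD_COVERAGE_ASSESSMENT': 46.642885},
}

# .get() with a default avoids a KeyError when a property is missing
# from a given product (some bands/properties are absent in older assets)
cloud = info.get('properties', {}).get('CLOUD_COVERAGE_ASSESSMENT')
print('Cloud coverage is:', cloud)  # → 46.642885
```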
With the geemap package, you can also view Earth Engine objects interactively without using the getInfo() function.
geemap.ee_initialize()
s2_collection
[geemap object inspector output, abridged: an expandable tree of the collection’s metadata.]
- type:ImageCollection
- id:COPERNICUS/S2_SR
- version:1691913964466514
- description:
Sentinel-2 is a wide-swath, high-resolution, multi-spectral imaging mission supporting Copernicus Land Monitoring studies, including the monitoring of vegetation, soil and water cover, as well as observation of inland waterways and coastal areas.
The Sentinel-2 L2 data are downloaded from scihub. They were computed by running sen2cor. WARNING: ESA did not produce L2 data for all L1 assets, and earlier L2 coverage is not global.
The assets contain 12 UINT16 spectral bands representing SR scaled by 10000 (unlike in L1 data, there is no B10). There are also several more L2-specific bands (see band list for details). See the Sentinel-2 User Handbook for details. In addition, three QA bands are present where one (QA60) is a bitmask band with cloud mask information. For more details, see the full explanation of how cloud masks are computed.
EE asset ids for Sentinel-2 L2 assets have the following format: COPERNICUS/S2_SR/20151128T002653_20151128T102149_T56MNN. Here the first numeric part represents the sensing date and time, the second numeric part represents the product generation date and time, and the final 6-character string is a unique granule identifier indicating its UTM grid reference (see MGRS).
Clouds can be removed by using COPERNICUS/S2_CLOUD_PROBABILITY. See this tutorial explaining how to apply the cloud mask.
For more details on Sentinel-2 radiometric resolution, see this page.
Provider: European Union/ESA/Copernicus
Revisit Interval
5 days
Bands
Name Description
B1 Aerosols
B2 Blue
B3 Green
B4 Red
B5 Red Edge 1
B6 Red Edge 2
B7 Red Edge 3
B8 NIR
B8A Red Edge 4
B9 Water vapor
B11 SWIR 1
B12 SWIR 2
AOT Aerosol Optical Thickness
WVP Water Vapor Pressure. The height the water would occupy if the vapor were condensed into liquid and spread evenly across the column.
SCL Scene Classification Map (The "No Data" value of 0 is masked out)
TCI_R True Color Image, Red channel
TCI_G True Color Image, Green channel
TCI_B True Color Image, Blue channel
MSK_CLDPRB Cloud Probability Map (missing in some products)
MSK_SNWPRB Snow Probability Map (missing in some products)
QA10 Always empty
QA20 Always empty
QA60 Cloud mask
Bitmask for QA60
- Bit 10: Opaque clouds
- 0: No opaque clouds
- 1: Opaque clouds present
- Bit 11: Cirrus clouds
- 0: No cirrus clouds
- 1: Cirrus clouds present
Image Properties
Name Type Description
AOT_RETRIEVAL_ACCURACY DOUBLE Accuracy of Aerosol Optical Thickness model
CLOUDY_PIXEL_PERCENTAGE DOUBLE Granule-specific cloudy pixel percentage taken from the original metadata
CLOUD_COVERAGE_ASSESSMENT DOUBLE Cloudy pixel percentage for the whole archive that contains this granule. Taken from the original metadata
CLOUDY_SHADOW_PERCENTAGE DOUBLE Percentage of pixels classified as cloud shadow
DARK_FEATURES_PERCENTAGE DOUBLE Percentage of pixels classified as dark features or shadows
DATASTRIP_ID STRING Unique identifier of the datastrip Product Data Item (PDI)
DATATAKE_IDENTIFIER STRING Uniquely identifies a given Datatake. The ID contains the Sentinel-2 satellite, start date and time, absolute orbit number, and processing baseline.
DATATAKE_TYPE STRING MSI operation mode
DEGRADED_MSI_DATA_PERCENTAGE DOUBLE Percentage of degraded MSI and ancillary data
FORMAT_CORRECTNESS STRING Synthesis of the On-Line Quality Control (OLQC) checks performed at granule (Product_Syntax) and datastrip (Product Syntax and DS_Consistency) levels
GENERAL_QUALITY STRING Synthesis of the OLQC checks performed at the datastrip level (Relative_Orbit_Number)
GENERATION_TIME DOUBLE Product generation time
GEOMETRIC_QUALITY STRING Synthesis of the OLQC checks performed at the datastrip level (Attitude_Quality_Indicator)
GRANULE_ID STRING Unique identifier of the granule PDI (PDI_ID)
HIGH_PROBA_CLOUDS_PERCENTAGE DOUBLE Percentage of pixels classified as high probability clouds
MEAN_INCIDENCE_AZIMUTH_ANGLE_B1 DOUBLE Mean value containing viewing incidence azimuth angle average for band B1 and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B2 DOUBLE Mean value containing viewing incidence azimuth angle average for band B2 and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B3 DOUBLE Mean value containing viewing incidence azimuth angle average for band B3 and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B4 DOUBLE Mean value containing viewing incidence azimuth angle average for band B4 and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B5 DOUBLE Mean value containing viewing incidence azimuth angle average for band B5 and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B6 DOUBLE Mean value containing viewing incidence azimuth angle average for band B6 and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B7 DOUBLE Mean value containing viewing incidence azimuth angle average for band B7 and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B8 DOUBLE Mean value containing viewing incidence azimuth angle average for band B8 and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B8A DOUBLE Mean value containing viewing incidence azimuth angle average for band B8a and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B9 DOUBLE Mean value containing viewing incidence azimuth angle average for band B9 and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B10 DOUBLE Mean value containing viewing incidence azimuth angle average for band B10 and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B11 DOUBLE Mean value containing viewing incidence azimuth angle average for band B11 and for all detectors
MEAN_INCIDENCE_AZIMUTH_ANGLE_B12 DOUBLE Mean value containing viewing incidence azimuth angle average for band B12 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B1 DOUBLE Mean value containing viewing incidence zenith angle average for band B1 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B2 DOUBLE Mean value containing viewing incidence zenith angle average for band B2 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B3 DOUBLE Mean value containing viewing incidence zenith angle average for band B3 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B4 DOUBLE Mean value containing viewing incidence zenith angle average for band B4 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B5 DOUBLE Mean value containing viewing incidence zenith angle average for band B5 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B6 DOUBLE Mean value containing viewing incidence zenith angle average for band B6 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B7 DOUBLE Mean value containing viewing incidence zenith angle average for band B7 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B8 DOUBLE Mean value containing viewing incidence zenith angle average for band B8 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B8A DOUBLE Mean value containing viewing incidence zenith angle average for band B8a and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B9 DOUBLE Mean value containing viewing incidence zenith angle average for band B9 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B10 DOUBLE Mean value containing viewing incidence zenith angle average for band B10 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B11 DOUBLE Mean value containing viewing incidence zenith angle average for band B11 and for all detectors
MEAN_INCIDENCE_ZENITH_ANGLE_B12 DOUBLE Mean value containing viewing incidence zenith angle average for band B12 and for all detectors
MEAN_SOLAR_AZIMUTH_ANGLE DOUBLE Mean value containing sun azimuth angle average for all bands and detectors
MEAN_SOLAR_ZENITH_ANGLE DOUBLE Mean value containing sun zenith angle average for all bands and detectors
MEDIUM_PROBA_CLOUDS_PERCENTAGE DOUBLE Percentage of pixels classified as medium probability clouds
MGRS_TILE STRING US-Military Grid Reference System (MGRS) tile
NODATA_PIXEL_PERCENTAGE DOUBLE Percentage of No Data pixels
NOT_VEGETATED_PERCENTAGE DOUBLE Percentage of pixels classified as non-vegetated
PROCESSING_BASELINE STRING Configuration baseline used at the time of the product generation in terms of processor software version and major Ground Image Processing Parameters (GIPP) version
PRODUCT_ID STRING The full id of the original Sentinel-2 product
RADIATIVE_TRANSFER_ACCURACY DOUBLE Accuracy of radiative transfer model
RADIOMETRIC_QUALITY STRING Based on the OLQC reports contained in the Datastrips/QI_DATA with RADIOMETRIC_QUALITY checklist name
REFLECTANCE_CONVERSION_CORRECTION DOUBLE Earth-Sun distance correction factor
SATURATED_DEFECTIVE_PIXEL_PERCENTAGE DOUBLE Percentage of saturated or defective pixels
SENSING_ORBIT_DIRECTION STRING Imaging orbit direction
SENSING_ORBIT_NUMBER DOUBLE Imaging orbit number
SENSOR_QUALITY STRING Synthesis of the OLQC checks performed at granule (Missing_Lines, Corrupted_ISP, and Sensing_Time) and datastrip (Degraded_SAD and Datation_Model) levels
SOLAR_IRRADIANCE_B1 DOUBLE Mean solar exoatmospheric irradiance for band B1
SOLAR_IRRADIANCE_B2 DOUBLE Mean solar exoatmospheric irradiance for band B2
SOLAR_IRRADIANCE_B3 DOUBLE Mean solar exoatmospheric irradiance for band B3
SOLAR_IRRADIANCE_B4 DOUBLE Mean solar exoatmospheric irradiance for band B4
SOLAR_IRRADIANCE_B5 DOUBLE Mean solar exoatmospheric irradiance for band B5
SOLAR_IRRADIANCE_B6 DOUBLE Mean solar exoatmospheric irradiance for band B6
SOLAR_IRRADIANCE_B7 DOUBLE Mean solar exoatmospheric irradiance for band B7
SOLAR_IRRADIANCE_B8 DOUBLE Mean solar exoatmospheric irradiance for band B8
SOLAR_IRRADIANCE_B8A DOUBLE Mean solar exoatmospheric irradiance for band B8a
SOLAR_IRRADIANCE_B9 DOUBLE Mean solar exoatmospheric irradiance for band B9
SOLAR_IRRADIANCE_B10 DOUBLE Mean solar exoatmospheric irradiance for band B10
SOLAR_IRRADIANCE_B11 DOUBLE Mean solar exoatmospheric irradiance for band B11
SOLAR_IRRADIANCE_B12 DOUBLE Mean solar exoatmospheric irradiance for band B12
SNOW_ICE_PERCENTAGE DOUBLE Percentage of pixels classified as snow or ice
SPACECRAFT_NAME STRING Sentinel-2 spacecraft name: Sentinel-2A, Sentinel-2B
THIN_CIRRUS_PERCENTAGE DOUBLE Percentage of pixels classified as thin cirrus clouds
UNCLASSIFIED_PERCENTAGE DOUBLE Percentage of unclassified pixels
VEGETATION_PERCENTAGE DOUBLE Percentage of pixels classified as vegetation
WATER_PERCENTAGE DOUBLE Percentage of pixels classified as water
WATER_VAPOUR_RETRIEVAL_ACCURACY DOUBLE Declared accuracy of the Water Vapor model
Terms of Use
The use of Sentinel data is governed by the Copernicus Sentinel Data Terms and Conditions.
[The inspector tree continues with the collection’s keywords, provider and sample-image links, the default RGB visualization parameters (bands B4,B3,B2; min 0.0; max 3000.0), and the expanded metadata of the first image (COPERNICUS/S2_SR/20230103T222541_20230103T222543_T59GPM), listing each band’s id, CRS, transform, dimensions, and pixel type; this is the same information returned by getInfo() above.]
- type:PixelType
- max:4294967295
- min:0
- precision:int
- 0:5490
- 1:5490
- id:QA60
- crs:EPSG:32759
- 0:60
- 1:0
- 2:600000
- 3:0
- 4:-60
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:1830
- 1:1830
- AOT_RETRIEVAL_ACCURACY:0
- AOT_RETRIEVAL_METHOD:CAMS
- BOA_ADD_OFFSET_B1:-1000
- BOA_ADD_OFFSET_B10:-1000
- BOA_ADD_OFFSET_B11:-1000
- BOA_ADD_OFFSET_B12:-1000
- BOA_ADD_OFFSET_B2:-1000
- BOA_ADD_OFFSET_B3:-1000
- BOA_ADD_OFFSET_B4:-1000
- BOA_ADD_OFFSET_B5:-1000
- BOA_ADD_OFFSET_B6:-1000
- BOA_ADD_OFFSET_B7:-1000
- BOA_ADD_OFFSET_B8:-1000
- BOA_ADD_OFFSET_B8A:-1000
- BOA_ADD_OFFSET_B9:-1000
- CLOUDY_PIXEL_OVER_LAND_PERCENTAGE:50.69924
- CLOUDY_PIXEL_PERCENTAGE:46.642885
- CLOUD_COVERAGE_ASSESSMENT:46.642885
- CLOUD_SHADOW_PERCENTAGE:0.110202
- DARK_FEATURES_PERCENTAGE:0.015425
- DATASTRIP_ID:S2A_OPER_MSI_L2A_DS_2APS_20230104T004052_S20230103T222543_N05.09
- DATATAKE_IDENTIFIER:GS2A_20230103T222541_039351_N05.09
- DATATAKE_TYPE:INS-NOBS
- DEGRADED_MSI_DATA_PERCENTAGE:0.0222
- FORMAT_CORRECTNESS:PASSED
- GENERAL_QUALITY:PASSED
- GENERATION_TIME:1672792852000
- GEOMETRIC_QUALITY:PASSED
- GRANULE_ID:L2A_T59GPM_A039351_20230103T222543
- GRANULE_MEAN_AOT:0.109682
- GRANULE_MEAN_WV:1.643327
- HIGH_PROBA_CLOUDS_PERCENTAGE:4.07571
- L2A_QUALITY:PASSED
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B1:276.704093858901
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B10:277.840545306706
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B11:276.999523662303
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B12:276.418405154167
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B2:279.971353433579
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B3:278.597725948063
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B4:277.700411170689
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B5:277.321221572236
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B6:277.024235679562
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B7:276.752673977078
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B8:279.21408931448
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B8A:276.493647061935
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B9:276.537241460679
- MEAN_INCIDENCE_ZENITH_ANGLE_B1:5.24543349610619
- MEAN_INCIDENCE_ZENITH_ANGLE_B10:4.94842418199968
- MEAN_INCIDENCE_ZENITH_ANGLE_B11:5.05856635514966
- MEAN_INCIDENCE_ZENITH_ANGLE_B12:5.18669139590784
- MEAN_INCIDENCE_ZENITH_ANGLE_B2:4.84258221755613
- MEAN_INCIDENCE_ZENITH_ANGLE_B3:4.90792023317554
- MEAN_INCIDENCE_ZENITH_ANGLE_B4:4.97874598116308
- MEAN_INCIDENCE_ZENITH_ANGLE_B5:5.02264667873905
- MEAN_INCIDENCE_ZENITH_ANGLE_B6:5.07216724083106
- MEAN_INCIDENCE_ZENITH_ANGLE_B7:5.12560449971087
- MEAN_INCIDENCE_ZENITH_ANGLE_B8:4.87519026295586
- MEAN_INCIDENCE_ZENITH_ANGLE_B8A:5.18056015760518
- MEAN_INCIDENCE_ZENITH_ANGLE_B9:5.31002472910349
- MEAN_SOLAR_AZIMUTH_ANGLE:60.7237619555518
- MEAN_SOLAR_ZENITH_ANGLE:33.1359663336527
- MEDIUM_PROBA_CLOUDS_PERCENTAGE:8.624577
- MGRS_TILE:59GPM
- NODATA_PIXEL_PERCENTAGE:0
- NOT_VEGETATED_PERCENTAGE:1.688521
- OZONE_SOURCE:AUX_ECMWFT
- OZONE_VALUE:266.58559
- PROCESSING_BASELINE:05.09
- PRODUCT_ID:S2A_MSIL2A_20230103T222541_N0509_R029_T59GPM_20230104T004052
- RADIATIVE_TRANSFER_ACCURACY:0
- RADIOMETRIC_QUALITY:PASSED
- REFLECTANCE_CONVERSION_CORRECTION:1.03425237032704
- SATURATED_DEFECTIVE_PIXEL_PERCENTAGE:0
- SENSING_ORBIT_DIRECTION:DESCENDING
- SENSING_ORBIT_NUMBER:29
- SENSOR_QUALITY:PASSED
- SNOW_ICE_PERCENTAGE:0.025418
- SOLAR_IRRADIANCE_B1:1884.69
- SOLAR_IRRADIANCE_B10:367.15
- SOLAR_IRRADIANCE_B11:245.59
- SOLAR_IRRADIANCE_B12:85.25
- SOLAR_IRRADIANCE_B2:1959.66
- SOLAR_IRRADIANCE_B3:1823.24
- SOLAR_IRRADIANCE_B4:1512.06
- SOLAR_IRRADIANCE_B5:1424.64
- SOLAR_IRRADIANCE_B6:1287.61
- SOLAR_IRRADIANCE_B7:1162.08
- SOLAR_IRRADIANCE_B8:1041.63
- SOLAR_IRRADIANCE_B8A:955.32
- SOLAR_IRRADIANCE_B9:812.92
- SPACECRAFT_NAME:Sentinel-2A
- THIN_CIRRUS_PERCENTAGE:33.942598
- UNCLASSIFIED_PERCENTAGE:0.177929
- VEGETATION_PERCENTAGE:9.221157
- WATER_PERCENTAGE:42.118463
- WATER_VAPOUR_RETRIEVAL_ACCURACY:0
- system:asset_size:1532591790
- type:LinearRing
- 0:172.25433569144138
- 1:-44.33448907292303
- 0:172.25435144080205
- 1:-44.33448964142126
- 0:173.63031409631594
- 1:-44.311082950423106
- 0:173.63037036150135
- 1:-44.311045091819324
- 0:173.6304327971241
- 1:-44.31101254040063
- 0:173.63043604474908
- 1:-44.3109976889258
- 0:173.6086507648804
- 1:-43.817310375548736
- 0:173.58741340494214
- 1:-43.32357935835102
- 0:173.58736140729522
- 1:-43.323539083561485
- 0:173.58731672178305
- 1:-43.32349437489322
- 0:173.58729624768324
- 1:-43.32349197859048
- 0:172.23381029069893
- 1:-43.34611004471465
- 0:172.23375407396483
- 1:-43.34614722400039
- 0:172.23369180434426
- 1:-43.34617901181267
- 0:172.23368823897152
- 1:-43.34619378033401
- 0:172.24382519264705
- 1:-43.84032357395031
- 0:172.2542237050964
- 1:-44.33440101880432
- 0:172.25427567805582
- 1:-44.33444193813843
- 0:172.2543201907514
- 1:-44.33448715793944
- 0:172.25433569144138
- 1:-44.33448907292303
- system:index:20230103T222541_20230103T222543_T59GPM
- system:time_end:1672784896764
- system:time_start:1672784896764
- type:Image
- id:COPERNICUS/S2_SR/20230108T222539_20230108T222541_T59GPM
- version:1673253812002844
- id:B1
- crs:EPSG:32759
- 0:60
- 1:0
- 2:600000
- 3:0
- 4:-60
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:1830
- 1:1830
- id:B2
- crs:EPSG:32759
- 0:10
- 1:0
- 2:600000
- 3:0
- 4:-10
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:10980
- 1:10980
- id:B3
- crs:EPSG:32759
- 0:10
- 1:0
- 2:600000
- 3:0
- 4:-10
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:10980
- 1:10980
- id:B4
- crs:EPSG:32759
- 0:10
- 1:0
- 2:600000
- 3:0
- 4:-10
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:10980
- 1:10980
- id:B5
- crs:EPSG:32759
- 0:20
- 1:0
- 2:600000
- 3:0
- 4:-20
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:5490
- 1:5490
- id:B6
- crs:EPSG:32759
- 0:20
- 1:0
- 2:600000
- 3:0
- 4:-20
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:5490
- 1:5490
- id:B7
- crs:EPSG:32759
- 0:20
- 1:0
- 2:600000
- 3:0
- 4:-20
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:5490
- 1:5490
- id:B8
- crs:EPSG:32759
- 0:10
- 1:0
- 2:600000
- 3:0
- 4:-10
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:10980
- 1:10980
- id:B8A
- crs:EPSG:32759
- 0:20
- 1:0
- 2:600000
- 3:0
- 4:-20
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:5490
- 1:5490
- id:B9
- crs:EPSG:32759
- 0:60
- 1:0
- 2:600000
- 3:0
- 4:-60
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:1830
- 1:1830
- id:B11
- crs:EPSG:32759
- 0:20
- 1:0
- 2:600000
- 3:0
- 4:-20
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:5490
- 1:5490
- id:B12
- crs:EPSG:32759
- 0:20
- 1:0
- 2:600000
- 3:0
- 4:-20
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:5490
- 1:5490
- id:AOT
- crs:EPSG:32759
- 0:10
- 1:0
- 2:600000
- 3:0
- 4:-10
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:10980
- 1:10980
- id:WVP
- crs:EPSG:32759
- 0:10
- 1:0
- 2:600000
- 3:0
- 4:-10
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:10980
- 1:10980
- id:SCL
- crs:EPSG:32759
- 0:20
- 1:0
- 2:600000
- 3:0
- 4:-20
- 5:5200000
- type:PixelType
- max:255
- min:0
- precision:int
- 0:5490
- 1:5490
- id:TCI_R
- crs:EPSG:32759
- 0:10
- 1:0
- 2:600000
- 3:0
- 4:-10
- 5:5200000
- type:PixelType
- max:255
- min:0
- precision:int
- 0:10980
- 1:10980
- id:TCI_G
- crs:EPSG:32759
- 0:10
- 1:0
- 2:600000
- 3:0
- 4:-10
- 5:5200000
- type:PixelType
- max:255
- min:0
- precision:int
- 0:10980
- 1:10980
- id:TCI_B
- crs:EPSG:32759
- 0:10
- 1:0
- 2:600000
- 3:0
- 4:-10
- 5:5200000
- type:PixelType
- max:255
- min:0
- precision:int
- 0:10980
- 1:10980
- id:MSK_CLDPRB
- crs:EPSG:32759
- 0:20
- 1:0
- 2:600000
- 3:0
- 4:-20
- 5:5200000
- type:PixelType
- max:255
- min:0
- precision:int
- 0:5490
- 1:5490
- id:MSK_SNWPRB
- crs:EPSG:32759
- 0:20
- 1:0
- 2:600000
- 3:0
- 4:-20
- 5:5200000
- type:PixelType
- max:255
- min:0
- precision:int
- 0:5490
- 1:5490
- id:QA10
- crs:EPSG:32759
- 0:10
- 1:0
- 2:600000
- 3:0
- 4:-10
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:10980
- 1:10980
- id:QA20
- crs:EPSG:32759
- 0:20
- 1:0
- 2:600000
- 3:0
- 4:-20
- 5:5200000
- type:PixelType
- max:4294967295
- min:0
- precision:int
- 0:5490
- 1:5490
- id:QA60
- crs:EPSG:32759
- 0:60
- 1:0
- 2:600000
- 3:0
- 4:-60
- 5:5200000
- type:PixelType
- max:65535
- min:0
- precision:int
- 0:1830
- 1:1830
- AOT_RETRIEVAL_ACCURACY:0
- AOT_RETRIEVAL_METHOD:SEN2COR_DDV
- BOA_ADD_OFFSET_B1:-1000
- BOA_ADD_OFFSET_B10:-1000
- BOA_ADD_OFFSET_B11:-1000
- BOA_ADD_OFFSET_B12:-1000
- BOA_ADD_OFFSET_B2:-1000
- BOA_ADD_OFFSET_B3:-1000
- BOA_ADD_OFFSET_B4:-1000
- BOA_ADD_OFFSET_B5:-1000
- BOA_ADD_OFFSET_B6:-1000
- BOA_ADD_OFFSET_B7:-1000
- BOA_ADD_OFFSET_B8:-1000
- BOA_ADD_OFFSET_B8A:-1000
- BOA_ADD_OFFSET_B9:-1000
- CLOUDY_PIXEL_OVER_LAND_PERCENTAGE:22.548898
- CLOUDY_PIXEL_PERCENTAGE:36.693248
- CLOUD_COVERAGE_ASSESSMENT:36.693248
- CLOUD_SHADOW_PERCENTAGE:0.302398
- DARK_FEATURES_PERCENTAGE:0.040099
- DATASTRIP_ID:S2B_OPER_MSI_L2A_DS_2BPS_20230108T234153_S20230108T222541_N05.09
- DATATAKE_IDENTIFIER:GS2B_20230108T222539_030514_N05.09
- DATATAKE_TYPE:INS-NOBS
- DEGRADED_MSI_DATA_PERCENTAGE:0.0138
- FORMAT_CORRECTNESS:PASSED
- GENERAL_QUALITY:PASSED
- GENERATION_TIME:1673221313000
- GEOMETRIC_QUALITY:PASSED
- GRANULE_ID:L2A_T59GPM_A030514_20230108T222541
- GRANULE_MEAN_AOT:0.185322
- GRANULE_MEAN_WV:1.012277
- HIGH_PROBA_CLOUDS_PERCENTAGE:27.684695
- L2A_QUALITY:PASSED
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B1:274.184601314582
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B10:275.473049038947
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B11:274.336095563191
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B12:273.792330637319
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B2:277.443988673209
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B3:276.257709319701
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B4:275.056611570466
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B5:275.134639756463
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B6:274.892941644275
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B7:274.627625036428
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B8:276.783722283394
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B8A:274.408683131432
- MEAN_INCIDENCE_AZIMUTH_ANGLE_B9:274.057023763267
- MEAN_INCIDENCE_ZENITH_ANGLE_B1:5.18829874570015
- MEAN_INCIDENCE_ZENITH_ANGLE_B10:4.89782739919536
- MEAN_INCIDENCE_ZENITH_ANGLE_B11:5.00615708048283
- MEAN_INCIDENCE_ZENITH_ANGLE_B12:5.13451239407322
- MEAN_INCIDENCE_ZENITH_ANGLE_B2:4.77481072636394
- MEAN_INCIDENCE_ZENITH_ANGLE_B3:4.84328864187583
- MEAN_INCIDENCE_ZENITH_ANGLE_B4:4.91167893592554
- MEAN_INCIDENCE_ZENITH_ANGLE_B5:4.96757191525675
- MEAN_INCIDENCE_ZENITH_ANGLE_B6:5.018176019417
- MEAN_INCIDENCE_ZENITH_ANGLE_B7:5.07065571494198
- MEAN_INCIDENCE_ZENITH_ANGLE_B8:4.80385657600717
- MEAN_INCIDENCE_ZENITH_ANGLE_B8A:5.12645749868677
- MEAN_INCIDENCE_ZENITH_ANGLE_B9:5.25404698287742
- MEAN_SOLAR_AZIMUTH_ANGLE:60.6678672642428
- MEAN_SOLAR_ZENITH_ANGLE:33.9236576989433
- MEDIUM_PROBA_CLOUDS_PERCENTAGE:8.9908
- MGRS_TILE:59GPM
- NODATA_PIXEL_PERCENTAGE:0
- NOT_VEGETATED_PERCENTAGE:2.756029
- OZONE_SOURCE:AUX_ECMWFT
- OZONE_VALUE:262.438999
- PROCESSING_BASELINE:05.09
- PRODUCT_ID:S2B_MSIL2A_20230108T222539_N0509_R029_T59GPM_20230108T234153
- RADIATIVE_TRANSFER_ACCURACY:0
- RADIOMETRIC_QUALITY:PASSED
- REFLECTANCE_CONVERSION_CORRECTION:1.03430820654262
- SATURATED_DEFECTIVE_PIXEL_PERCENTAGE:0
- SENSING_ORBIT_DIRECTION:DESCENDING
- SENSING_ORBIT_NUMBER:29
- SENSOR_QUALITY:PASSED
- SNOW_ICE_PERCENTAGE:0.006656
- SOLAR_IRRADIANCE_B1:1874.3
- SOLAR_IRRADIANCE_B10:365.41
- SOLAR_IRRADIANCE_B11:247.08
- SOLAR_IRRADIANCE_B12:87.75
- SOLAR_IRRADIANCE_B2:1959.75
- SOLAR_IRRADIANCE_B3:1824.93
- SOLAR_IRRADIANCE_B4:1512.79
- SOLAR_IRRADIANCE_B5:1425.78
- SOLAR_IRRADIANCE_B6:1291.13
- SOLAR_IRRADIANCE_B7:1175.57
- SOLAR_IRRADIANCE_B8:1041.28
- SOLAR_IRRADIANCE_B8A:953.93
- SOLAR_IRRADIANCE_B9:817.58
- SPACECRAFT_NAME:Sentinel-2B
- THIN_CIRRUS_PERCENTAGE:0.017754
- UNCLASSIFIED_PERCENTAGE:0.258032
- VEGETATION_PERCENTAGE:14.245981
- WATER_PERCENTAGE:45.697555
- WATER_VAPOUR_RETRIEVAL_ACCURACY:0
- system:asset_size:1548407990
- type:LinearRing
- 0:172.25433569144138
- 1:-44.33448907292303
- 0:172.25435144080205
- 1:-44.33448964142126
- 0:173.63031409631594
- 1:-44.311082950423106
- 0:173.63037036150135
- 1:-44.311045091819324
- 0:173.6304327971241
- 1:-44.31101254040063
- 0:173.63043604474908
- 1:-44.3109976889258
- 0:173.6086507648804
- 1:-43.817310375548736
- 0:173.58741340494214
- 1:-43.32357935835102
- 0:173.58736140729522
- 1:-43.323539083561485
- 0:173.58731672178305
- 1:-43.32349437489322
- 0:173.58729624768324
- 1:-43.32349197859048
- 0:172.23381029069893
- 1:-43.34611004471465
- 0:172.23375407396483
- 1:-43.34614722400039
- 0:172.23369180434426
- 1:-43.34617901181267
- 0:172.23368823897152
- 1:-43.34619378033401
- 0:172.24382519264705
- 1:-43.84032357395031
- 0:172.2542237050964
- 1:-44.33440101880432
- 0:172.25427567805582
- 1:-44.33444193813843
- 0:172.2543201907514
- 1:-44.33448715793944
- 0:172.25433569144138
- 1:-44.33448907292303
- system:index:20230108T222539_20230108T222541_T59GPM
- system:time_end:1673216894634
- system:time_start:1673216894634
2.6 ee.Image band composite visualisation
We can use geemap to create a band composite visualization of the selected image:
# First we need to scale the image bands to values between 0 and 1.
# The value to divide by comes from the band scale of 0.0001 listed in the Earth Engine Catalog:
# https://developers.google.com/earth-engine/datasets/catalog/COPERNICUS_S2_SR#description
# This is often done to rescale the data from 16-bit integers to floating-point values (0 to 1 range)
s2_img = s2_img.divide(10000)
Prepare the center coordinates and the dictionary with visualization parameters:
# Get the x and y coordinates from the geometry and convert them to Python variables
# These coordinates represent the longitude (x) and latitude (y) values of the point geometry
# we used to filter when searching the Image Collection
# Get the x-coordinate (longitude) from the geometry's coordinates and convert to Python variable
cx = geom.coordinates().get(0).getInfo()
# Get the y-coordinate (latitude) from the geometry's coordinates and convert to Python variable
cy = geom.coordinates().get(1).getInfo()
# Define visualization parameters for Sentinel-2 data
# These parameters specify which bands to use for visualization and the data range
# Define the bands for visualization as ['B4', 'B3', 'B2']
# These correspond to the red, green, and blue bands, respectively
# This combination creates a natural color image
# Set the minimum value for visualization to 0
# Set the maximum value for visualization to 1
# This range is used to visualize reflectance values in a normalized range
s2_vis_param = {
'bands': ['B4', 'B3', 'B2'],
'min': 0,
'max': 1
}
Create and visualize the image composite in an interactive map:
# Create a Map instance from geemap with a specified center and zoom level
# The map will be centered at the coordinates [cy, cx] and zoomed to level 9
s2_vis = geemap.Map(center=[cy, cx], zoom=9)
# Load an ee.Image (s2_img) and add it to the map with the specified visualization parameters
# The layer will be named "Sentinel R(B4)G(B3)B(B2)"
s2_vis.addLayer(s2_img, s2_vis_param, "Sentinel R(B4)G(B3)B(B2)")
# Add a LayerControl widget to the map for managing layers
s2_vis.addLayerControl()
# Display the interactive map with the added layers and controls
s2_vis
2.7 Downloading images to a local folder
Note that the maximum size of an export request in Earth Engine is 48 MB (50331648 bytes). For that reason, we will select only three bands and also use the bounding box of a 1 km buffer around the geom point to select a smaller part of the image. You can also use geemap to export images to a local folder or extract pixel values as a numpy.ndarray. For more details see the geemap documentation.
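Before running the export, it can help to sanity-check that the clipped request fits within the limit. A rough back-of-envelope sketch (all numbers below are illustrative assumptions, not values reported by Earth Engine):

```python
# Estimate the size of an export request: bands x pixels x bytes per pixel.
# Assumptions: a roughly 2 km x 2 km bounding box (a 1 km buffer around the
# point), 10 m pixels, 3 bands, 2 bytes per 16-bit pixel value.
# Real requests add some format and metadata overhead on top of this.
n_bands = 3
pixel_size_m = 10
box_width_m = 2 * 1000              # 1 km buffer on each side of the point
pixels_per_side = box_width_m // pixel_size_m
bytes_per_pixel = 2                 # uint16 band values
request_bytes = n_bands * pixels_per_side ** 2 * bytes_per_pixel
print(f"Approximate request size: {request_bytes / 1024:.0f} KB")
# Well under the 50331648-byte limit, so a direct download is feasible.
```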
# Create a region of interest (ROI) by buffering the geometry and extracting its bounding box
buffered_geom = geom.buffer(1000)
# Extract the bounding box of the buffered geometry to define the region
# This defines the spatial extent for analysis within the buffered area
region = buffered_geom.bounds()
# Define the list of bands (channels) to export
channels = ['B4', 'B3', 'B2']
Select the bands, clip the image using the region defined above, and export it:
# Export the clipped image to your local folder
# Clip the Sentinel-2 image (s2_img) to the defined region of interest
# The result is stored in the 'clipped' variable
clipped = s2_img.clip(region)
# Select specific bands (channels) from the clipped image for export
# Use the 'channels' list defined earlier to select the specified bands
selected_channels = clipped.select(channels)
# Export the selected channels as an image to the specified filename
# The image will be exported to the 'Lab4_data' folder in your local directory
geemap.ee_export_image(selected_channels, filename='Lab4_data/S2_img.tif')
Generating URL ...
Downloading data from https://earthengine.googleapis.com/v1/projects/earthengine-legacy/thumbnails/273e652d9f82babf5c71c5516fa1db62-28f50aa069e2ebd16ee8ec13df5921b3:getPixels
Please wait ...
Data downloaded to C:\Users\vda14\OneDrive - University of Canterbury\Teaching\2023\GEOG324\Labs\Lab4\Lab4_data\S2_img.tif
We can also export all the images in an image collection:
# Map a clipping function over each image in the Sentinel-2 collection (s2_collection)
# The result is a new collection ('clipped') where each image is clipped to the region
clipped = s2_collection.map(lambda x: x.clip(region))
# Export the entire clipped image collection to the specified output directory
# All images in the 'clipped' collection will be exported to the 'Lab4_data' folder
geemap.ee_export_image_collection(clipped, out_dir='Lab4_data')
Total number of images: 2
Exporting 1/2: Lab4_data\20230103T222541_20230103T222543_T59GPM.tif
Generating URL ...
Downloading data from https://earthengine.googleapis.com/v1/projects/earthengine-legacy/thumbnails/1f40cb191303eb8c0bc1df013d587484-c1977336f8e4dc311bd6de46e75d157f:getPixels
Please wait ...
Data downloaded to C:\Users\vda14\OneDrive - University of Canterbury\Teaching\2023\GEOG324\Labs\Lab4\Lab4_data\20230103T222541_20230103T222543_T59GPM.tif
Exporting 2/2: Lab4_data\20230108T222539_20230108T222541_T59GPM.tif
Generating URL ...
Downloading data from https://earthengine.googleapis.com/v1/projects/earthengine-legacy/thumbnails/3714c2b708ab3ec2dea7071f55934a9a-9b5c6ed52d0629e9271ed84033cd90e8:getPixels
Please wait ...
Data downloaded to C:\Users\vda14\OneDrive - University of Canterbury\Teaching\2023\GEOG324\Labs\Lab4\Lab4_data\20230108T222539_20230108T222541_T59GPM.tif
geemap also allows you to extract pixels as a NumPy ndarray using ee_to_numpy():
# Extract Pixels as Numpy ndarray
# Convert the Earth Engine image 's2_img' to a numpy ndarray
# Select specific bands (channels) using the 'channels' list
# The region of interest for extraction is defined by the 'region' variable
img_array = geemap.ee_to_numpy(s2_img.select(channels), region=region)
# Print the shape of the extracted numpy ndarray
# The shape represents the dimensions of the ndarray (rows, columns, bands)
print(img_array.shape)
# Display each band using the 'matplotlib' library
import matplotlib.pyplot as plt
# Create a figure with 'num_bands' subplots arranged in a row
fig, ax = plt.subplots(1, img_array.shape[2], figsize=(5 * img_array.shape[2], 5))
# Loop over the bands and display each one as an image in a separate subplot
for i in range(img_array.shape[2]):
    ax[i].imshow(img_array[:, :, i], cmap='gray')
    ax[i].set_title(f'Band {i+1}')
# Display the figure
plt.show()
(205, 204, 3)
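Since the bands come back as a (rows, columns, bands) array, they can also be shown as a single true-colour composite rather than band by band. A minimal sketch (the array here is simulated with random values so the snippet is self-contained; with real data you would reuse the img_array extracted above):

```python
import numpy as np
import matplotlib.pyplot as plt

# Simulate the (rows, cols, bands) array returned by ee_to_numpy()
# with random reflectance-like values, purely for illustration.
rng = np.random.default_rng(42)
img_array = rng.uniform(0, 0.3, size=(205, 204, 3))

# Stretch the values to the 0-1 range expected by imshow for RGB display.
rgb = (img_array - img_array.min()) / (img_array.max() - img_array.min())

# The bands were selected as ['B4', 'B3', 'B2'], which is already
# the red-green-blue order imshow expects.
plt.imshow(rgb)
plt.title('RGB composite')
plt.show()
```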
2.8 Band maths: Calculating NDVI
NDVI (Normalized Difference Vegetation Index) is a widely used remote sensing index calculated from satellite imagery. It quantifies vegetation health by comparing the difference between near-infrared and red reflectance, indicating the density and vigor of plant cover. NDVI values range from -1 to +1, with higher values suggesting healthier vegetation. We have calculated it in previous labs using rasterio and rioxarray. Now we will learn how to do it using GEE. First, we define a function that calculates the NDVI:
# Define function to compute NDVI
# This function calculates the Normalized Difference Vegetation Index (NDVI) for an input image.
def compute_ndvi(img):
    # Calculate NDVI by normalizing the difference between the near-infrared (B8) and red (B4) bands
    ndvi = img.normalizedDifference(['B8', 'B4']).rename('ndvi')
    # Add the computed NDVI band to the input image
    return img.addBands(ndvi)
Apply the function to the ee.Image s2_img to calculate NDVI:
# Calculate NDVI using the 'compute_ndvi' function on the Sentinel-2 image 's2_img'
# The function computes NDVI and adds the NDVI band to the image
# Select the 'ndvi' band from the computed NDVI image
ndvi = compute_ndvi(s2_img).select(['ndvi'])
Prepare the visualization parameters:
# Define visualization parameters for NDVI image
# Specify that the 'ndvi' band should be used for visualization
# Set the minimum value for visualization to -1 and the maximum value to 1
# This range represents the typical NDVI values from -1 to 1, indicating vegetation health.
# Choose a color palette for visualization using hexadecimal color codes
# The palette represents a gradient from negative values (pink) to positive values (green).
# Color palette for visualization specified as hexadecimal color codes.
# I chose these colors using https://colorbrewer2.org/
ndvi_vis_param = {
'bands': ['ndvi'],
'min': -1,
'max': 1,
'palette': ['#d01c8b', '#f1b6da', '#f7f7f7', '#b8e186', '#4dac26']
}
Visualize the NDVI using an interactive map:
# Create a map instance ('ndvi_vis') using the geemap library.
# Set the center of the map at the coordinates [cy, cx] and zoom to level 9.
ndvi_vis = geemap.Map(center=[cy, cx], zoom=9)
# Load the NDVI image ('ndvi') and add it to the map with specified visualization parameters.
# The layer will be named "NDVI" and will use the settings defined in 'ndvi_vis_param'.
ndvi_vis.addLayer(ndvi, ndvi_vis_param, "NDVI")
# Add a LayerControl widget to the map for managing layers.
# This widget allows users to toggle layers on and off in the map interface.
ndvi_vis.addLayerControl()
# Extract color palette, minimum, and maximum values from the visualization parameter.
colors = ndvi_vis_param['palette']
vmin = ndvi_vis_param['min']
vmax = ndvi_vis_param['max']
# Add a colorbar to the map using Branca.
# The colorbar visually represents the range of NDVI values using the specified color palette.
ndvi_vis.add_colorbar_branca(colors=colors, vmin=vmin, vmax=vmax, layer_name="NDVI")
# Display the interactive map with the added layers, controls, and colorbar.
ndvi_vis
2.8.1 Plot time series
What about plotting NDVI changes over time for the area of interest? In this example, we will use historical Sentinel-2 images to look at maize crop growth from October (planting) to April (drying).
- Load the area of interest aoi using the uploaded shapefile “ncrs.shp”.
- Compute NDVI images over an image collection using .map() with the compute_ndvi function passed into it.
- Calculate the median value for each NDVI image within aoi using reduceRegions().
- Convert the FeatureCollection resulting from reduceRegions() into a Python list object.
- Plot time series statistics using matplotlib.
In the Lab4_data folder, which you downloaded from Learn, you will find a shapefile named “ncrs.shp”. Upload “ncrs.shp” to the Lab4_data folder in your Jupyter Hub before attempting the next steps. Go back to Lab 1 section 4.3 if you do not remember how to upload data to Jupyter Hub.
Run the code below after uploading the data:
# Define area of interest using shapefile uploaded
# Convert the shapefile 'Lab4_data/ncrs.shp' to an Earth Engine geometry object.
aoi = geemap.shp_to_ee('Lab4_data/ncrs.shp')
# Define start and end date for image collection
# Define the start and end dates for filtering the Sentinel-2 image collection.
start_date = '2021-10-01'
end_date = '2022-04-01'
# Load the Sentinel-2 surface reflectance image collection ('COPERNICUS/S2_SR').
# Filter the collection to include images within the defined AOI and date range.
# Further filter images with a cloudy pixel percentage less than or equal to 10%.
# Apply the 'compute_ndvi' function to compute NDVI value for each image.
s2_sr = ee.ImageCollection('COPERNICUS/S2_SR') \
.filterBounds(aoi) \
.filterDate(start_date, end_date) \
.filter(ee.Filter.lte('CLOUDY_PIXEL_PERCENTAGE', 10)) \
.map(compute_ndvi) # Compute NDVI value for each image
# Select NDVI band from the processed image collection
s2_ndvi = s2_sr.select(['ndvi'])
# Print the number of images in the collection
print("Number of images in collection: ", s2_ndvi.size().getInfo())
# The code below exports the images to the local disk. This is just for your reference. Do not run it
# as it might cause storage issues in your JupyterHub
# clipped = s2_ndvi.map(lambda x: x.clip(aoi.toList(s2_ndvi.size()).get(0)))
# geemap.ee_export_image_collection(clipped, out_dir='Lab4_data')
Number of images in collection: 10
Now that we have an NDVI image collection with 10 images, we can create summary statistics. In Google Earth Engine, “reduce” refers to aggregating (or summarising) data in an image or an image collection over a given region or set of regions. The term “reducer” refers to the type of aggregation function that is applied, such as ee.Reducer.mean(), ee.Reducer.max(), or ee.Reducer.min(), to name a few. Let’s define a function that extracts the NDVI statistics:
# Define a function for reducing region statistics for charting
# This function, named 'extractStats', is used to calculate various statistics
# for an input image within the specified region of interest (AOI).
def extractStats(image):
    # Use the 'reduceRegions' function to calculate statistics for the image within the AOI.
    return image.reduceRegions(**{
        'collection': aoi,  # The collection parameter specifies the region of interest (AOI).
        'reducer': ee.Reducer.minMax()\
            .combine(ee.Reducer.median(), '', True)\
            .combine(ee.Reducer.mean(), '', True),
        # The 'reducer' parameter defines the combination of reducers to be applied.
        # MinMax, median, and mean reducers are combined to calculate various statistics.
        # The empty string '' means no prefix is added to the combined output names.
        # The `True` argument specifies that the combined reducers share the same inputs.
    })
* and ** are called the unpacking operators. They are used to unpack iterables like lists, tuples, and dictionaries, so that their elements can be passed as separate arguments to a function.
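A minimal illustration of the unpacking operators (the function and values below are hypothetical, purely for demonstration):

```python
# Demonstrate * (iterable unpacking) and ** (dict unpacking) in a function call.
def describe(name, value, unit):
    return f"{name} = {value} {unit}"

args = ["NDVI", 0.82]           # elements passed positionally via *
kwargs = {"unit": "unitless"}   # entries passed as keyword arguments via **
print(describe(*args, **kwargs))
# Equivalent to calling describe("NDVI", 0.82, unit="unitless")
```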
We will also define a function to add a date to each NDVI image:
# Define a function for adding date as a band to each image
# This function, named 'addDate', is used to add a new band representing the image date
# to each input image.
def addDate(image):
    # Get the date of the image and format it as YYYYMMdd.
    img_date = ee.Date(image.date())
    img_date = ee.Number.parse(img_date.format('YYYYMMdd'))
    # Add the formatted date as a new band named 'date' to the input image.
    return image.addBands(ee.Image(img_date).rename('date').toInt())
The addDate function uses the date method to get the date of the image and formats it as a YYYYMMdd number using the format method. This formatted date is then added as a new band to the image using the addBands method. The rename method is used to give the new band a name, and toInt is used to convert it to an integer data type. Apply the created functions to generate NDVI statistics:
S2NDVI = s2_ndvi.map(addDate).map(extractStats).flatten()
# flatten the resulting feature collection into a single collection of features
# S2NDVI.first().getInfo() # inspect the first record
df_columns = list(S2NDVI.first().getInfo()['properties'].keys())
# Feature collection (mapped) to data frame with reducer
nested_list = S2NDVI.reduceColumns(ee.Reducer.toList(len(df_columns)), df_columns).values().get(0)
# nested_list.getInfo()
Now we need to convert the Earth Engine data to a Pandas DataFrame:
# Import the 'pandas' library for working with DataFrames.
import pandas as pd
# Call the 'getInfo' callback method to retrieve the Earth Engine data as Python objects.
# Create a DataFrame from the retrieved data with specified column names.
# The DataFrame will have columns as per the defined column names.
df = pd.DataFrame(nested_list.getInfo(), columns=df_columns)
# Process and organize the DataFrame
# Convert the 'date_min' column to a Pandas datetime format ('YYYYMMdd' to datetime).
# Sort the DataFrame by the 'date' column to organize the data chronologically.
df['date'] = pd.to_datetime(df['date_min'], format='%Y%m%d')
df_new = df.sort_values(by=['date'])
# Print the organized DataFrame.
df_new
|   | Id | date_max | date_mean | date_median | date_min | ndvi_max | ndvi_mean | ndvi_median | ndvi_min | date |
|---|----|----------|-----------|-------------|----------|----------|-----------|-------------|----------|------|
| 0 | 0 | 20211015 | 20211015 | 20211015 | 20211015 | 0.566740 | 0.482887 | 0.478913 | 0.426559 | 2021-10-15 |
| 1 | 0 | 20211106 | 20211106 | 20211106 | 20211106 | 0.250150 | 0.206848 | 0.205298 | 0.186848 | 2021-11-06 |
| 2 | 0 | 20211224 | 20211224 | 20211224 | 20211224 | 0.935264 | 0.924427 | 0.925226 | 0.905646 | 2021-12-24 |
| 3 | 0 | 20211231 | 20211231 | 20211231 | 20211231 | 0.937535 | 0.925854 | 0.925244 | 0.912918 | 2021-12-31 |
| 4 | 0 | 20220103 | 20220103 | 20220103 | 20220103 | 0.943231 | 0.932584 | 0.932240 | 0.924317 | 2022-01-03 |
| 5 | 0 | 20220115 | 20220115 | 20220115 | 20220115 | 0.800279 | 0.746826 | 0.749923 | 0.649072 | 2022-01-15 |
| 6 | 0 | 20220214 | 20220214 | 20220214 | 20220214 | 0.593935 | 0.557921 | 0.568185 | 0.303103 | 2022-02-14 |
| 7 | 0 | 20220224 | 20220224 | 20220224 | 20220224 | 0.205488 | 0.150877 | 0.144343 | 0.129940 | 2022-02-24 |
| 8 | 0 | 20220314 | 20220314 | 20220314 | 20220314 | 0.213159 | 0.146826 | 0.139136 | 0.126539 | 2022-03-14 |
| 9 | 0 | 20220316 | 20220316 | 20220316 | 20220316 | 0.185852 | 0.145046 | 0.141210 | 0.132197 | 2022-03-16 |
## How to handle datetime formats in Python?
In Python, we use the datetime.strptime() class method of the datetime module to parse a string such as ‘yyyy-mm-dd hh:mm:ss’ and convert it to a datetime object. The second argument to strptime() is a format string that describes the structure of the input string. For example, try datetime.datetime.strptime('2023-05-12 13:30:00', '%Y-%m-%d %H:%M:%S'). We can use the strftime() method of a datetime object to format it as a string. The argument to strftime() is a format string that describes the desired output format. For example, try datetime.datetime(2023, 5, 12, 13, 30).strftime('%d/%m/%Y').
Similarly, you can convert a string to a datetime object using the to_datetime() function of the Pandas library. For example, try pd.to_datetime('2023-05-12 13:30:00'). We use the format argument of to_datetime() to specify the format of the input string. The format string follows the same conventions as the strftime() and strptime() methods of the datetime module.
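The round trip between strings and datetime objects can be verified with a short snippet (the example dates are arbitrary):

```python
import datetime
import pandas as pd

# Parse a string into a datetime object with strptime()
dt = datetime.datetime.strptime('2023-05-12 13:30:00', '%Y-%m-%d %H:%M:%S')

# Format a datetime object as a string with strftime()
formatted = dt.strftime('%d/%m/%Y')  # '12/05/2023'

# Pandas can parse strings too, optionally with an explicit format
# (the same '%Y%m%d' pattern used for the 'date_min' column above)
ts = pd.to_datetime('20230512', format='%Y%m%d')
print(dt.year, formatted, ts.month)  # 2023 12/05/2023 5
```

Note that the parse directives (`%Y`, `%m`, `%d`, …) are shared by strptime(), strftime(), and pd.to_datetime().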
Now we can use the dataframe to plot all the NDVI statistics using matplotlib:
# Import matplotlib and create a figure and axis for the plot.
import matplotlib.pyplot as plt
fig, ax = plt.subplots(figsize=(9, 3))
# Plot the 'ndvi_min', 'ndvi_median', and 'ndvi_max' columns against the 'date' column.
# Each statistic is plotted as a series of points with lines ('o-') in different colors.
# The colors 'r' (red), 'g' (green), and 'b' (blue) represent 'ndvi_min', 'ndvi_median', and 'ndvi_max' respectively.
# Legend labels are provided for each series.
ax.plot(df['date'], df['ndvi_min'], 'o-', color='r', label="min")
ax.plot(df['date'], df['ndvi_median'], 'o-', color='g', label="median")
ax.plot(df['date'], df['ndvi_max'], 'o-', color='b', label="max")
# Add a legend to the plot indicating the meaning of each series.
plt.legend(loc="best")
# Set labels for the x-axis and y-axis.
ax.set_xlabel('Date')
ax.set_ylabel('NDVI')
# Display the plot.
plt.show()
2 Microsoft Planetary Computer API
The Microsoft Planetary Computer API is another platform (launched in 2020) that provides access to large geospatial datasets and tools for analysis, and it has quickly gained popularity in the geospatial data science community.
GEE is specifically focused on remote sensing data and provides a wide range of tools for processing and analyzing that type of data. On the other hand, the Microsoft Planetary Computer API provides access to a broader range of geospatial data, including remote sensing data, climate data, and social and economic data. It also includes tools for analysis and machine learning, as well as access to cloud-based computing resources.
The Planetary Computer API is built on top of Microsoft Azure and leverages a range of Azure services such as Azure Blob Storage, Azure Machine Learning, and Azure Kubernetes Service. It offers a scalable and reliable infrastructure for working with large and complex geospatial datasets.
The Planetary Computer API is available to researchers, developers, and organizations who are working on projects related to environmental sustainability. It is free to use, but users must apply for access and be approved by Microsoft. The API provides a powerful tool for analyzing geospatial data and developing solutions to some of the world’s most pressing environmental challenges.
More info on https://planetarycomputer.microsoft.com/docs/quickstarts/reading-stac/
Optionally, you can check out their tutorials on https://github.com/microsoft/PlanetaryComputerExamples/tree/main/tutorials
3 OpenStreetMap (OSM) API | OSMnx
OpenStreetMap (OSM) is a collaborative project to create a free and editable map of the world. OSMnx is a Python library that allows you to interact with and analyze OpenStreetMap (OSM) data. It provides tools for downloading, modeling, analyzing, and visualizing street networks and other geographical features. OSMnx simplifies the process of working with OSM data and can be used for various tasks related to urban planning, transportation analysis, and geographical research. You can find more information about OSMnx here.
Here are some key features and functionalities of OSMnx:
- Data Retrieval: OSMnx allows you to download OSM data for specific geographical areas by specifying coordinates, addresses, place names, or polygons. It retrieves data from the OSM database and provides options to filter data by different attributes.
- Network Modeling: The library can automatically construct complex street networks from the downloaded OSM data. It can represent both pedestrian and vehicular networks, including one-way streets, speed limits, and more.
- Graph Analysis: OSMnx creates a network graph where nodes represent intersections and endpoints, and edges represent streets. You can perform various graph analysis operations such as finding shortest paths, calculating network statistics, and more.
- Geographical Visualization: OSMnx provides tools for visualizing street networks, nodes, edges, and other geographical features. It supports visualization using Matplotlib and can display graphs with different attributes and styles.
- Routing: OSMnx can calculate shortest paths and routes for both pedestrians and vehicles within the constructed network. It uses graph algorithms to find optimal paths between nodes.
- Urban Form Analysis: The library supports urban form analysis by computing metrics like street segment lengths, connectivity, and block statistics. These metrics are useful for understanding urban layout and design.
- Multi-modal Networks: OSMnx allows you to combine multiple modes of transportation, such as pedestrians, cyclists, and vehicles, into a single network representation.
- Customization: You can customize various aspects of the generated plots, including colors, styles, labels, and more.
3.0.1 Extracting road network
Getting a road network from OSMnx is one of the core functionalities of the library. OSMnx provides convenient functions to retrieve, analyze, and visualize road networks based on OpenStreetMap data.The graph_from_place function retrieves a street network graph for a specified place using OpenStreetMap data. You provide a place name or boundary, and OSMnx fetches the road network graph for that location. Let’s get the street network for the University of Canterbury:
# Import the OSMnx library
import osmnx as ox
# Define the location for which we want to retrieve OSM information
place = "University of Canterbury"
# Extract the street network graph based on the specified place name
network = ox.graph_from_place(place)
It is also possible to extract the street network graph based on bounding box coordinates:
# Define the bounding box coordinates. These coordinates are in EPSG:4326
north = -43.5199121
south = -43.528178
east = 172.5883328
west = 172.5728817
# extract the street network graph based on UC bounding box coordinates
network = ox.graph_from_bbox(north, south, east, west, network_type='all')Check the type of the network variable:
type(network)networkx.classes.multidigraph.MultiDiGraph
networkx.classes.multidigraph.MultiDiGraph is a class provided by the NetworkX library, which is used for creating and manipulating directed graphs with multiple edges and self-loops. In the context of street networks and transportation systems, a MultiDiGraph can be used to represent complex relationships between nodes (intersections) and edges (streets or roads). Visualize the network object:
# Plot the street network graph and get the figure and axis objects
fig, ax = ox.plot_graph(network)
These are the nodes (intersections) and edges (street segments) within the UC bounding box. We can separate nodes and edges to create GeoDataFrames:
# Convert the graph's nodes and edges into GeoDataFrames
nodes, edges = ox.graph_to_gdfs(network)
Visualize road intersections:
nodes.plot()
<Axes: >
Visualize road edges:
edges.plot()
<Axes: >
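The MultiDiGraph structure mentioned above can also be explored with a tiny hand-built graph, without downloading any OSM data. In the sketch below, the node names and edge attributes are made up purely for illustration:

```python
import networkx as nx

# Build a tiny street-like graph by hand
G = nx.MultiDiGraph()
# Two parallel edges between the same pair of nodes, e.g. a road
# and a cycleway connecting the same two intersections
G.add_edge('A', 'B', key=0, length=120, highway='residential')
G.add_edge('A', 'B', key=1, length=150, highway='cycleway')
# A one-way street only exists in one direction
G.add_edge('B', 'C', key=0, length=80, oneway=True)

print(G.number_of_nodes(), G.number_of_edges())  # 3 3
# Parallel edges between the same nodes are distinguished by their key
print(G['A']['B'][1]['highway'])  # cycleway
```

This is why OSMnx uses a MultiDiGraph rather than a plain DiGraph: real street networks routinely contain parallel edges and direction-dependent attributes.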
3.0.2 Geocoding
Geocoding is the process of converting a textual description of a location, such as a place name or address, into geographical coordinates (latitude and longitude) on the Earth’s surface. OSMnx offers the geocode function for that, which returns the latitude and longitude of the best match. Let’s use it to geocode some addresses in Christchurch:
import geopandas as gpd
from shapely.geometry import Point
# List of addresses to geocode
address_list = [
"11A Montana Avenue, Christchurch, New Zealand",
"9 Zenith Place, Christchurch, New Zealand",
"611 Cashel Street, Christchurch, New Zealand"
]
geom = []   # List to store Point geometries
found = []  # Addresses that were successfully geocoded
# Loop through the address list and geocode each address
for address in address_list:
    try:
        # ox.geocode returns a (latitude, longitude) tuple, and raises an
        # exception (rather than returning None) when no match is found
        y, x = ox.geocode(address)
        geom.append(Point(x, y))  # Note: Point takes (x, y) = (lon, lat)
        found.append(address)
    except Exception:
        print(f"Could not geocode: {address}")
# Create a GeoDataFrame with a dictionary of attributes and a list of geometries
gdf = gpd.GeoDataFrame({'Address': found}, geometry=geom, crs=4326)
# Print the GeoDataFrame
print(gdf)
                                         Address                     geometry
0 11A Montana Avenue, Christchurch, New Zealand POINT (172.58158 -43.52062)
1 9 Zenith Place, Christchurch, New Zealand POINT (172.53609 -43.53042)
2 611 Cashel Street, Christchurch, New Zealand POINT (172.65395 -43.53310)
Plot the result:
gdf.plot()
<Axes: >
3.0.3 Routing
Routing is the process of determining the optimal path or route between two or more locations in a network, such as a transportation or communication network. The goal of routing is to find the most efficient way to navigate from a starting point (origin) to a destination point (destination) while considering various factors, such as distance, time, cost, or specific constraints. NetworkX is a Python library used for the creation, manipulation, and analysis of complex networks or graphs. It provides tools to work with various types of graphs, including directed, undirected, weighted, and multigraphs.
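Before routing on a real street network, the mechanics can be seen on a tiny toy graph, where made-up 'length' attributes stand in for street segment lengths:

```python
import networkx as nx

# A small directed graph with 'length' attributes on the edges
G = nx.MultiDiGraph()
G.add_edge('A', 'B', length=100)
G.add_edge('B', 'C', length=100)
G.add_edge('A', 'C', length=250)  # direct, but longer

# Without a weight, shortest_path minimizes the number of edges: A -> C
hop_path = nx.shortest_path(G, 'A', 'C')
print(hop_path)  # ['A', 'C']

# Weighted by 'length', going via B is shorter (100 + 100 = 200 < 250)
length_path = nx.shortest_path(G, 'A', 'C', weight='length')
print(length_path)  # ['A', 'B', 'C']
```

This is exactly why weight='length' is passed in the UC example below: without it, the route would minimize the number of intersections crossed rather than the distance travelled.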
Let’s find the shortest path between the Beatrice Tinsley building and Campus Security using networkx:
import networkx as nx
# Define the start and end points for routing
start_point = (-43.522804, 172.583258) # Latitude and longitude for BT
end_point = (-43.521537568382065, 172.57975565615905) # Latitude and longitude for security
# Find the nearest nodes to the start and end points
start_node = ox.distance.nearest_nodes(network, start_point[1], start_point[0])
end_node = ox.distance.nearest_nodes(network, end_point[1], end_point[0])
# Find the shortest path using networkx
route = nx.shortest_path(network, source=start_node, target=end_node, weight='length')
# Plot the route on the street network
ox.plot_graph_route(network, route, route_linewidth=6, node_size=0, bgcolor='k')
(<Figure size 800x800 with 1 Axes>, <Axes: >)
We can transform the obtained route into a LineString and add it to a GeoDataFrame:
from shapely.geometry import LineString
# Convert the route to a list of (longitude, latitude) coordinates, i.e. (x, y)
route_coordinates = [(network.nodes[node]['x'], network.nodes[node]['y']) for node in route]
# Convert the list of coordinates into a LineString geometry
route_line = LineString(route_coordinates)
# Create a GeoDataFrame with the LineString geometry
route_gdf = gpd.GeoDataFrame({'geometry': [route_line]}, crs=4326)
# Inspect the result
route_gdf.plot()
<Axes: >
3.1 Extracting area boundaries
We can use OSMnx to extract a polygon representing a neighborhood, city, area, or region into a GeoDataFrame. Let’s extract the polygon representing the UC boundaries:
# Geocode the place to a GeoDataFrame with the boundary
area = ox.geocode_to_gdf(place)
# Print the type of the 'area' variable
print(type(area))
# Plot the polygon of the area
area.plot()
<class 'geopandas.geodataframe.GeoDataFrame'>
<Axes: >
3.3 Exporting data from OSMnx
We can export OSM data to a shapefile or GeoPackage once we transform it into a GeoDataFrame. Let’s check the GeoDataFrame containing the UC buildings (extracted in the previous step):
buildings.head(2)
| element_type | osmid | geometry | wheelchair | access | amenity | nodes | building | description | name | building:levels | building:levels:underground | ... | min_level | building:min_level | landuse | ref:linz:building_id | shelter_type | ways | type | max_level | alt_name | healthcare |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| way | 32162596 | POLYGON ((172.58216 -43.52121, 172.58222 -43.5... | NaN | NaN | NaN | [361008120, 7697590493, 9164406433, 9164406432... | yes | The School of Engineering was the second area ... | School of Engineering | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| | 34365972 | POLYGON ((172.58267 -43.52176, 172.58226 -43.5... | NaN | NaN | NaN | [394348883, 6997440534, 6997440535, 1082401996... | university | NaN | West | 8 | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
2 rows × 62 columns
We can check all building types by applying pd.unique to the column building:
import pandas as pd
pd.unique(buildings.building)
array(['yes', 'university', 'school', 'warehouse', 'retail', 'dormitory',
'church', 'pavilion', 'industrial', 'house', 'service', 'shed',
'roof', 'no', 'detached', 'semidetached_house', 'kindergarten'],
dtype=object)
Check all columns available on the GeoDataFrame building:
buildings.columns
Index(['geometry', 'wheelchair', 'access', 'amenity', 'nodes', 'building',
'description', 'name', 'building:levels', 'building:levels:underground',
'building:part', 'addr:city', 'addr:country', 'addr:postcode',
'addr:street', 'operator', 'addr:housenumber', 'roof:levels',
'roof:shape', 'image', 'website', 'nohousenumber', 'source',
'addr:place', 'email', 'opening_hours', 'payment:credit_cards',
'payment:debit_cards', 'phone', 'shop', 'office', 'religion',
'addr:suburb', 'ref:linz:address_id', 'heritage:website', 'historic',
'start_date', 'wikidata', 'club', 'internet_access', 'wikipedia',
'name:mi', 'alt_name:en', 'bicycle_parking', 'old_name', 'covered',
'fee', 'locked', 'operator:type', 'layer', 'place_of_worship', 'fax',
'min_level', 'building:min_level', 'landuse', 'ref:linz:building_id',
'shelter_type', 'ways', 'type', 'max_level', 'alt_name', 'healthcare'],
dtype='object')
Verify number of columns and rows:
# The number of rows indicates the number of buildings
# 194 in the UC area
buildings.shape
(194, 62)
Select the geometry and name columns into a new GeoDataFrame:
buildings_sel = buildings[["geometry", "name"]]
buildings_sel.head(2)
| element_type | osmid | geometry | name |
|---|---|---|---|
| way | 32162596 | POLYGON ((172.58216 -43.52121, 172.58222 -43.5... | School of Engineering |
| | 34365972 | POLYGON ((172.58267 -43.52176, 172.58226 -43.5... | West |
Export selection to a shapefile:
buildings_sel.to_file("Lab4_data/buildings.shp")
To finish in Pythonic style, let’s visualize in a single figure all the OSM data we downloaded:
# Create a figure and an axis for plotting, and set the figure size
fig, ax = plt.subplots(figsize=(10, 10))
# Plot each of the GeoDataFrames on the same axis so they all appear in one figure
# Plot the 'area' GeoDataFrame with black face color
area.plot(ax=ax, facecolor="black")
# Plot the 'edges' GeoDataFrame with specified line width and edge color
edges.plot(ax=ax, linewidth=1, edgecolor="#BC8F8F")
# Plot the 'buildings' GeoDataFrame with khaki face color and adjusted transparency
buildings.plot(ax=ax, facecolor="khaki", alpha=0.7)
# Plot the 'trees' GeoDataFrame with green face color
trees.plot(ax=ax, facecolor="green")
<Axes: >
4 Christchurch City Council Spatial Open Data Portal API
Most territorial authorities in New Zealand offer public access to their datasets such as council assets, infrastructure, planning rules, natural and cultural heritage. For example, the Christchurch City Council maintains authoritative spatial datasets that can be accessed through their Spatial Open Data Portal. See https://opendata-christchurchcity.hub.arcgis.com/.
The requests library is a popular Python library used for making HTTP requests to interact with web services and APIs. It simplifies the process of sending HTTP requests and handling responses, making it easier to retrieve data from web servers, submit forms, and perform other web-related tasks. The requests library is widely used in web scraping, API integration, and other web-related programming tasks.
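Before calling a live service, it helps to understand the two pieces involved: the query string appended after the `?` in the endpoint URL, and the GeoJSON structure that comes back. Both can be explored offline with the standard library alone (the sample feature below is made up, loosely modeled on the manhole layer used next):

```python
import json
from urllib.parse import urlencode

# Build the query string that appears after '?' in the endpoint URL.
# urlencode percent-encodes special characters, e.g. '=' becomes %3D.
params = {'outFields': '*', 'where': '1=1', 'f': 'geojson'}
query = urlencode(params)
print(query)

# A minimal GeoJSON FeatureCollection like the one the API returns:
# a 'features' list, each feature holding 'properties' and a 'geometry'
sample = json.loads("""
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {"SwStructureID": 1, "Type": "Manhole"},
      "geometry": {"type": "Point", "coordinates": [172.70845, -43.48232]}
    }
  ]
}
""")
features = sample['features']
print(len(features), features[0]['properties']['Type'])  # 1 Manhole
```

The `data['features']` extraction in the code below follows exactly this structure, and `gpd.GeoDataFrame.from_features` turns each feature's properties into columns and its geometry into the geometry column.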
As an example, let’s pull the manhole data from their API using the requests library.
# Import the necessary libraries
import requests
import json
import geopandas as gpd
# Define the API endpoint URL
endpoint = 'https://gis.ccc.govt.nz/server/rest/services/OpenData/StormWater/FeatureServer/16/query?outFields=*&where=1%3D1&f=geojson'
# Send a GET request to the API endpoint and retrieve the JSON response
response = requests.get(endpoint)
# Parse the JSON response into a dictionary
data = response.json()
# Extract the GeoJSON features from the dictionary
geojson_data = data['features']
# Create a GeoDataFrame from the GeoJSON features with the specified CRS
gdf = gpd.GeoDataFrame.from_features(geojson_data, crs=4326) # WGS84
# Inspect the result
gdf.head(5)
| | geometry | SwStructureID | Type | ServiceStatus | Ownership | Responsibility | Maintenance | LocationCertainty | InstallationCompany | Construction | ... | CommissionDate | DecommissionDate | Comment | Project | ContractorExternalReference | SAPInternalReference | CreateDate | LastEditDate | Shape__Area | Shape__Length |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | POLYGON ((172.70845 -43.48232, 172.70845 -43.4... | 1 | Manhole | In Service | CCC | Land Drainage Operations | Land Drainage Operations | Accurate | None | None | ... | NaN | NaN | None | None | None | IE000000000010307460 | NaN | 1363917144000 | 4.805594 | 8.768860 |
| 1 | POLYGON ((172.71018 -43.48229, 172.71018 -43.4... | 2 | Manhole | In Service | CCC | Land Drainage Operations | Land Drainage Operations | Accurate | None | None | ... | NaN | NaN | None | None | None | IE000000000010307748 | NaN | 1363917352000 | 4.841305 | 8.801498 |
| 2 | POLYGON ((172.71176 -43.48227, 172.71176 -43.4... | 3 | Manhole | In Service | CCC | Land Drainage Operations | Land Drainage Operations | Accurate | None | None | ... | NaN | NaN | None | None | None | IE000000000010308026 | NaN | 1590707953000 | 6.535316 | 9.890124 |
| 3 | POLYGON ((172.71170 -43.48146, 172.71170 -43.4... | 4 | Manhole | In Service | CCC | Land Drainage Operations | Land Drainage Operations | Accurate | None | None | ... | NaN | NaN | None | None | None | IE000000000010308296 | NaN | 1590707953000 | 2.321603 | 6.133053 |
| 4 | POLYGON ((172.71252 -43.48226, 172.71252 -43.4... | 5 | Manhole | In Service | CCC | Land Drainage Operations | Land Drainage Operations | Accurate | None | None | ... | NaN | NaN | None | None | None | IE000000000010308555 | NaN | 1590707953000 | 3.031829 | 7.086492 |
5 rows × 21 columns
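Note that the CreateDate and LastEditDate fields look like Unix timestamps in milliseconds, a common way for ArcGIS FeatureServer layers to store dates. They can be converted to readable datetimes with pandas (using the LastEditDate value from the first row above):

```python
import pandas as pd

# LastEditDate from the first row, in milliseconds since the Unix epoch
ms = 1363917144000
ts = pd.to_datetime(ms, unit='ms')
print(ts)  # 2013-03-22 01:52:24
```

The same conversion can be applied to a whole column, e.g. `pd.to_datetime(gdf['LastEditDate'], unit='ms')`.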
Export the data to a GeoPackage:
gdf.to_file('Lab4_data/ccc_sw.gpkg')
5 Final Pythonic note
🎉 Congratulations on finishing this lab! 🎉 Now you have new data sources to work with in your GISC412|GEOG324 research project and beyond! 🌍 This was just an introduction to the vast world of APIs, and there are many more out there for you to explore! 🚀
Best regards, Dr. Vanessa Bastos 👩🏫